Feb 20 09:55:06 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 20 09:55:06 crc restorecon[4767]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:06 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 09:55:07 crc restorecon[4767]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc 
restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc 
restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 
09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 
crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 
09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc 
restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc 
restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 09:55:07 crc restorecon[4767]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 09:55:07 crc restorecon[4767]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:07 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 09:55:08 crc restorecon[4767]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc 
restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 09:55:08 crc restorecon[4767]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 20 09:55:08 crc kubenswrapper[4962]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 09:55:08 crc kubenswrapper[4962]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 20 09:55:08 crc kubenswrapper[4962]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 09:55:08 crc kubenswrapper[4962]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 20 09:55:08 crc kubenswrapper[4962]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 20 09:55:08 crc kubenswrapper[4962]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.845274 4962 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848106 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848125 4962 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848130 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848138 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848145 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848150 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848156 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848164 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848170 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848176 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848180 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848184 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848196 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848200 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848203 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848207 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848211 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848214 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848218 4962 feature_gate.go:330] 
unrecognized feature gate: SignatureStores Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848221 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848224 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848228 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848231 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848235 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848239 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848242 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848246 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848249 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848253 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848256 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848260 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848263 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848267 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 09:55:08 crc 
kubenswrapper[4962]: W0220 09:55:08.848271 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848275 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848280 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848284 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848288 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848292 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848296 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848300 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848304 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848308 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848312 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848315 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848319 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848322 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 
20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848326 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848330 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848334 4962 feature_gate.go:330] unrecognized feature gate: Example Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848338 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848341 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848345 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848348 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848351 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848355 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848360 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848364 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848369 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848373 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848376 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848380 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.848384 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849734 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849747 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849752 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849757 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849761 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849766 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849769 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.849773 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 09:55:08 crc 
kubenswrapper[4962]: I0220 09:55:08.850715 4962 flags.go:64] FLAG: --address="0.0.0.0" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850734 4962 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850746 4962 flags.go:64] FLAG: --anonymous-auth="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850754 4962 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850761 4962 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850767 4962 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850775 4962 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850781 4962 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850787 4962 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850793 4962 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850798 4962 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850804 4962 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850809 4962 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850821 4962 flags.go:64] FLAG: --cgroup-root="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850827 4962 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850832 4962 flags.go:64] FLAG: --client-ca-file="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850837 
4962 flags.go:64] FLAG: --cloud-config="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850842 4962 flags.go:64] FLAG: --cloud-provider="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850848 4962 flags.go:64] FLAG: --cluster-dns="[]" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850854 4962 flags.go:64] FLAG: --cluster-domain="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850860 4962 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850866 4962 flags.go:64] FLAG: --config-dir="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850872 4962 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850921 4962 flags.go:64] FLAG: --container-log-max-files="5" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850931 4962 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850937 4962 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850943 4962 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850949 4962 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850956 4962 flags.go:64] FLAG: --contention-profiling="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850962 4962 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850967 4962 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850972 4962 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850978 4962 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 
09:55:08.850985 4962 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850990 4962 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.850996 4962 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851001 4962 flags.go:64] FLAG: --enable-load-reader="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851006 4962 flags.go:64] FLAG: --enable-server="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851011 4962 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851018 4962 flags.go:64] FLAG: --event-burst="100" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851024 4962 flags.go:64] FLAG: --event-qps="50" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851029 4962 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851034 4962 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851039 4962 flags.go:64] FLAG: --eviction-hard="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851046 4962 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851051 4962 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851056 4962 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851061 4962 flags.go:64] FLAG: --eviction-soft="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851066 4962 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851072 4962 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 20 09:55:08 crc 
kubenswrapper[4962]: I0220 09:55:08.851077 4962 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851082 4962 flags.go:64] FLAG: --experimental-mounter-path="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851087 4962 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851092 4962 flags.go:64] FLAG: --fail-swap-on="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851097 4962 flags.go:64] FLAG: --feature-gates="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851104 4962 flags.go:64] FLAG: --file-check-frequency="20s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851109 4962 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851114 4962 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851119 4962 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851125 4962 flags.go:64] FLAG: --healthz-port="10248" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851132 4962 flags.go:64] FLAG: --help="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851137 4962 flags.go:64] FLAG: --hostname-override="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851141 4962 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851147 4962 flags.go:64] FLAG: --http-check-frequency="20s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851152 4962 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851157 4962 flags.go:64] FLAG: --image-credential-provider-config="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851162 4962 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 20 09:55:08 crc 
kubenswrapper[4962]: I0220 09:55:08.851167 4962 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851172 4962 flags.go:64] FLAG: --image-service-endpoint="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851177 4962 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851182 4962 flags.go:64] FLAG: --kube-api-burst="100" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851187 4962 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851193 4962 flags.go:64] FLAG: --kube-api-qps="50" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851198 4962 flags.go:64] FLAG: --kube-reserved="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851203 4962 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851208 4962 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851213 4962 flags.go:64] FLAG: --kubelet-cgroups="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851218 4962 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851223 4962 flags.go:64] FLAG: --lock-file="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851228 4962 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851233 4962 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851239 4962 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851247 4962 flags.go:64] FLAG: --log-json-split-stream="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851251 4962 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 20 09:55:08 crc 
kubenswrapper[4962]: I0220 09:55:08.851256 4962 flags.go:64] FLAG: --log-text-split-stream="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851264 4962 flags.go:64] FLAG: --logging-format="text" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851270 4962 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851275 4962 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851280 4962 flags.go:64] FLAG: --manifest-url="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851285 4962 flags.go:64] FLAG: --manifest-url-header="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851293 4962 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851298 4962 flags.go:64] FLAG: --max-open-files="1000000" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851307 4962 flags.go:64] FLAG: --max-pods="110" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851312 4962 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851317 4962 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851322 4962 flags.go:64] FLAG: --memory-manager-policy="None" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851327 4962 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851332 4962 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851337 4962 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851343 4962 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 20 09:55:08 crc 
kubenswrapper[4962]: I0220 09:55:08.851355 4962 flags.go:64] FLAG: --node-status-max-images="50" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851360 4962 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851366 4962 flags.go:64] FLAG: --oom-score-adj="-999" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851371 4962 flags.go:64] FLAG: --pod-cidr="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851376 4962 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851385 4962 flags.go:64] FLAG: --pod-manifest-path="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851390 4962 flags.go:64] FLAG: --pod-max-pids="-1" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851396 4962 flags.go:64] FLAG: --pods-per-core="0" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851401 4962 flags.go:64] FLAG: --port="10250" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851406 4962 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851412 4962 flags.go:64] FLAG: --provider-id="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851417 4962 flags.go:64] FLAG: --qos-reserved="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851422 4962 flags.go:64] FLAG: --read-only-port="10255" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851427 4962 flags.go:64] FLAG: --register-node="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851432 4962 flags.go:64] FLAG: --register-schedulable="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851437 4962 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851448 4962 flags.go:64] 
FLAG: --registry-burst="10" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851454 4962 flags.go:64] FLAG: --registry-qps="5" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851460 4962 flags.go:64] FLAG: --reserved-cpus="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851465 4962 flags.go:64] FLAG: --reserved-memory="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851472 4962 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851480 4962 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851486 4962 flags.go:64] FLAG: --rotate-certificates="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851491 4962 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851499 4962 flags.go:64] FLAG: --runonce="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851505 4962 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851510 4962 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851516 4962 flags.go:64] FLAG: --seccomp-default="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851521 4962 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851527 4962 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851533 4962 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851539 4962 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851544 4962 flags.go:64] FLAG: --storage-driver-password="root" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851550 4962 flags.go:64] FLAG: 
--storage-driver-secure="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851556 4962 flags.go:64] FLAG: --storage-driver-table="stats" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851561 4962 flags.go:64] FLAG: --storage-driver-user="root" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851566 4962 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851571 4962 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851577 4962 flags.go:64] FLAG: --system-cgroups="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851582 4962 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851593 4962 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851599 4962 flags.go:64] FLAG: --tls-cert-file="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851624 4962 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851631 4962 flags.go:64] FLAG: --tls-min-version="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851637 4962 flags.go:64] FLAG: --tls-private-key-file="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851642 4962 flags.go:64] FLAG: --topology-manager-policy="none" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851647 4962 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851652 4962 flags.go:64] FLAG: --topology-manager-scope="container" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851659 4962 flags.go:64] FLAG: --v="2" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851666 4962 flags.go:64] FLAG: --version="false" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851673 4962 flags.go:64] FLAG: --vmodule="" Feb 20 
09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851680 4962 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.851686 4962 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851837 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851845 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851850 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851855 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851862 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851867 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851873 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851877 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851882 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851887 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851891 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851896 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851900 4962 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851905 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851910 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851914 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851918 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851923 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851927 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851931 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851935 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851940 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851945 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851949 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851953 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851966 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851971 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 20 09:55:08 crc 
kubenswrapper[4962]: W0220 09:55:08.851978 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851985 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851990 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.851997 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852003 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852009 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852013 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852018 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852023 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852028 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852035 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852041 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852047 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852059 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852065 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852070 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852074 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852079 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852083 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852088 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852092 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852097 4962 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852102 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852106 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852111 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852115 4962 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852120 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852125 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852129 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852134 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852138 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852143 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852148 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852153 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852157 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852161 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852166 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852171 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852175 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852179 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852183 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852190 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852195 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.852200 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.854240 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.867189 4962 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.867243 4962 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867429 4962 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867446 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867457 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867468 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867477 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867487 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867497 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867505 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867514 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867541 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867550 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867558 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867566 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867574 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867582 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867595 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867625 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867633 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867641 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867649 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867657 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867664 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867673 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867681 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867689 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867697 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867705 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867712 4962 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867720 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867728 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867736 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867756 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867764 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867773 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867782 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867789 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867797 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867805 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867813 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867824 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867835 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867844 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867853 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867862 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867870 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867878 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867887 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867895 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867904 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867912 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867921 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867929 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867937 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867945 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867953 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867963 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867974 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867985 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.867994 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868002 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868011 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868019 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868027 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868034 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868042 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868050 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868058 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868077 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868085 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868093 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868101 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.868114 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868906 4962 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868922 4962 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868930 4962 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868940 4962 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868948 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868956 4962 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868967 4962 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868979 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868987 4962 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.868995 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869004 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869012 4962 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869020 4962 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869028 4962 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869036 4962 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869044 4962 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869052 4962 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869060 4962 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869068 4962 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869075 4962 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869083 4962 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869090 4962 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869099 4962 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869106 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869114 4962 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869122 4962 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869130 4962 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869138 4962 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869146 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869154 4962 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869162 4962 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869171 4962 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869183 4962 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869193 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869202 4962 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869211 4962 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869219 4962 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869227 4962 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869235 4962 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869243 4962 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869251 4962 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869259 4962 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869266 4962 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869274 4962 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869282 4962 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869290 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869297 4962 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869305 4962 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869313 4962 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869322 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869330 4962 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869341 4962 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869350 4962 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869359 4962 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869367 4962 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869375 4962 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869384 4962 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869393 4962 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869405 4962 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869414 4962 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869423 4962 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869432 4962 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869441 4962 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869450 4962 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869458 4962 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869466 4962 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869475 4962 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869484 4962 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869493 4962 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869501 4962 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 09:55:08 crc kubenswrapper[4962]: W0220 09:55:08.869509 4962 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.869521 4962 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.870744 4962 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.882336 4962 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.882550 4962 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.886658 4962 server.go:997] "Starting client certificate rotation"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.886708 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.889477 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-13 13:27:58.090666184 +0000 UTC
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.889650 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.917376 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 09:55:08 crc kubenswrapper[4962]: E0220 09:55:08.920736 4962 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.921389 4962 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.937325 4962 log.go:25] "Validated CRI v1 runtime API"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.974653 4962 log.go:25] "Validated CRI v1 image API"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.976765 4962 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.983807 4962 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-20-09-49-36-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 20 09:55:08 crc kubenswrapper[4962]: I0220 09:55:08.983856 4962 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.018824 4962 manager.go:217] Machine: {Timestamp:2026-02-20 09:55:09.013547668 +0000 UTC m=+0.596019594 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0de0f937-0896-4e78-90b7-d2d7a1bed2a9 BootID:3742e1d0-bedb-4f62-b44c-22d7df4a090f Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:49:72:3c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:49:72:3c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:6c:a6:37 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f8:41:62 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b6:b5:2b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1a:8c:e8 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:9a:ce:cb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:de:c3:b4:1e:b3:88 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6e:e9:98:b5:88:ae Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.019553 4962 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.020174 4962 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.020879 4962 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.021326 4962 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.021494 4962 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.021997 4962 topology_manager.go:138] "Creating topology manager with none policy"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.022109 4962 container_manager_linux.go:303] "Creating device plugin manager"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.023106 4962 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.023284 4962 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.023741 4962 state_mem.go:36] "Initialized new in-memory state store"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.024414 4962 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.028958 4962 kubelet.go:418] "Attempting to sync node with API server"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.029116 4962 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.029261 4962 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.029376 4962 kubelet.go:324] "Adding apiserver pod source"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.029524 4962 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.034952 4962 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.035695 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.035744 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.035882 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.035914 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.036582 4962 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.038971 4962 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.040775 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.040998 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.041160 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.041346 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.041523 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.041749 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.041918 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.042090 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.042245 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.042397 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.042580 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.042820 4962 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.043792 4962 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.044983 4962 server.go:1280] "Started kubelet" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.045631 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.046737 4962 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.046720 4962 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 20 09:55:09 crc systemd[1]: Started Kubernetes Kubelet. Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.048324 4962 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.050676 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.050739 4962 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.051013 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 16:45:40.682367979 +0000 UTC Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.051202 4962 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.051229 4962 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.051351 4962 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 
09:55:09.051365 4962 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.061008 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="200ms" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.061262 4962 factory.go:55] Registering systemd factory Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.061308 4962 factory.go:221] Registration of the systemd container factory successfully Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.061753 4962 server.go:460] "Adding debug handlers to kubelet server" Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.061929 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.062067 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.062209 4962 factory.go:153] Registering CRI-O factory Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.062244 4962 factory.go:221] Registration of the crio container factory successfully Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.062358 4962 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api 
service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.062394 4962 factory.go:103] Registering Raw factory Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.062423 4962 manager.go:1196] Started watching for new ooms in manager Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.067813 4962 manager.go:319] Starting recovery of all containers Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.067932 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895ebd07858256a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:55:09.044925802 +0000 UTC m=+0.627397728,LastTimestamp:2026-02-20 09:55:09.044925802 +0000 UTC m=+0.627397728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077175 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077248 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 
09:55:09.077278 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077307 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077332 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077357 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077376 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077395 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077459 4962 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077479 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077502 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077522 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077566 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077596 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077648 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077665 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077684 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077702 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077720 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077738 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077756 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077775 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077793 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077810 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077828 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077846 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077867 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077887 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077907 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077926 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077947 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077968 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.077988 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078007 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078025 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078043 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078061 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078081 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078100 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078149 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078167 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078186 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078280 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078311 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078341 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078364 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078389 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078412 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078434 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078456 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078481 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078504 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078535 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078562 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078598 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078667 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078696 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078724 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078749 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078772 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078797 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078821 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078846 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078869 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078893 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078919 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078940 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078966 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.078988 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079010 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079034 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079060 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079085 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079110 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079133 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079156 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079181 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079205 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079230 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079255 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079280 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079309 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079334 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079357 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079378 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079402 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079426 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079447 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079478 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079502 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079527 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079550 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079576 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079637 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079669 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079693 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079722 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079745 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079768 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079792 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079834 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079860 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079885 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079910 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079947 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.079977 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080005 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080031 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080059 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080113 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080146 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080172 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080200 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080228 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080285 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080317 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080345 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080373 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080401 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080427 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080453 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080481 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080508 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080533 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080560 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080587 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080697 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080751 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080778 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080809 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080835 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080864 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080890 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080915 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080942 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080967 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.080992 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081018 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081047 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081074 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081100 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081124 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081154 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081181 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081205 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081230 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081255 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081282 4962 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081307 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081331 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081356 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081382 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081407 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081432 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081460 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081487 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081517 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.081542 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084368 4962 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084423 4962 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084454 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084482 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084507 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084534 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084559 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084588 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084649 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084677 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084702 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084755 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084783 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084808 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084835 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084863 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084909 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084936 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084964 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.084992 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085019 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085047 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085075 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085098 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085139 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085167 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085193 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085219 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085248 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085276 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085307 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085335 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085365 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085395 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085434 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085458 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085487 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085516 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085541 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085567 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085632 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085663 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085688 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085715 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085744 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085770 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085796 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085823 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085851 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085878 4962 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085901 4962 reconstruct.go:97] "Volume reconstruction finished"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.085918 4962 reconciler.go:26] "Reconciler: start to sync state"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.105753 4962 manager.go:324] Recovery completed
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.124093 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.127636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.127685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.127734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.128682 4962 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.128751 4962 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.128838 4962 state_mem.go:36] "Initialized new in-memory state store"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.134745 4962 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.136980 4962 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.137545 4962 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.137581 4962 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.137646 4962 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.142200 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.142263 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.145812 4962 policy_none.go:49] "None policy: Start"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.146613 4962 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.146640 4962 state_mem.go:35] "Initializing new in-memory state store"
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.151799 4962 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.200918 4962 manager.go:334] "Starting Device Plugin manager"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.201005 4962 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.201029 4962 server.go:79] "Starting device plugin registration server"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.201862 4962 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.201896 4962 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.202253 4962 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.202400 4962 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.202422 4962 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.210845 4962 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.238125 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.238266 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.239781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.239814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.239823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.239950 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.240717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.240802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.240826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.241499 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.241633 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.241646 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.241699 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.241774 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243637 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.243801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.244028 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.244109 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.244150 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.245675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.245715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.245728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.245824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.245858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.245873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.246075 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.246306 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.246392 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247372 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247427 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247784 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.247871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.248763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.248789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.248800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.262141 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="400ms"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288264 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288339 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288493 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288717 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288768 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288815 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288894 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.288913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.303799 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.305548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.305640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.305660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.305699 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.306180 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390424 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390449 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390494 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390508 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390544 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390564 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390502 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390587 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390553 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\")
" pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390516 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390656 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390679 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390729 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390759 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390725 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") 
" pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390793 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390903 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.390795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.507249 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.509219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.509261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.509274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.509299 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.509669 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.575772 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.583922 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.595814 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.611275 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.622573 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.631531 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c7951c3d50e9c2fc9b77a6ab3b9650e2ba8ad3b9be14711ad702d031dfdaf1a6 WatchSource:0}: Error finding container c7951c3d50e9c2fc9b77a6ab3b9650e2ba8ad3b9be14711ad702d031dfdaf1a6: Status 404 returned error can't find the container with id c7951c3d50e9c2fc9b77a6ab3b9650e2ba8ad3b9be14711ad702d031dfdaf1a6 Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.633541 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8248272d8f2a54dfb0468128646aa58e2845a9406415c86d6687d7282fb0f5ea WatchSource:0}: Error finding container 8248272d8f2a54dfb0468128646aa58e2845a9406415c86d6687d7282fb0f5ea: Status 404 returned error can't find the container with id 8248272d8f2a54dfb0468128646aa58e2845a9406415c86d6687d7282fb0f5ea Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.640071 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b3da178ca6cbe80b162b88ca43f450af9b0cbc4cf1a899f17908aa08b90fb495 WatchSource:0}: Error finding container b3da178ca6cbe80b162b88ca43f450af9b0cbc4cf1a899f17908aa08b90fb495: Status 404 returned error can't find the container with id 
b3da178ca6cbe80b162b88ca43f450af9b0cbc4cf1a899f17908aa08b90fb495 Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.656016 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9c4423ab2453f310252723006d893ed510df6839e76be2acfa6302528f8cd0c2 WatchSource:0}: Error finding container 9c4423ab2453f310252723006d893ed510df6839e76be2acfa6302528f8cd0c2: Status 404 returned error can't find the container with id 9c4423ab2453f310252723006d893ed510df6839e76be2acfa6302528f8cd0c2 Feb 20 09:55:09 crc kubenswrapper[4962]: W0220 09:55:09.661705 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d4f2b26ed9c34142abbc43fe4b5267d89f6b3cc5c80c0c8a01c98d53bcc7e39a WatchSource:0}: Error finding container d4f2b26ed9c34142abbc43fe4b5267d89f6b3cc5c80c0c8a01c98d53bcc7e39a: Status 404 returned error can't find the container with id d4f2b26ed9c34142abbc43fe4b5267d89f6b3cc5c80c0c8a01c98d53bcc7e39a Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.663734 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="800ms" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.910418 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.911788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.911846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:09 crc 
kubenswrapper[4962]: I0220 09:55:09.911861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:09 crc kubenswrapper[4962]: I0220 09:55:09.911891 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 09:55:09 crc kubenswrapper[4962]: E0220 09:55:09.912436 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.046514 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.051694 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 18:54:08.981495155 +0000 UTC Feb 20 09:55:10 crc kubenswrapper[4962]: W0220 09:55:10.072314 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.072376 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.141909 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c4423ab2453f310252723006d893ed510df6839e76be2acfa6302528f8cd0c2"} Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.143320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b3da178ca6cbe80b162b88ca43f450af9b0cbc4cf1a899f17908aa08b90fb495"} Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.144867 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8248272d8f2a54dfb0468128646aa58e2845a9406415c86d6687d7282fb0f5ea"} Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.146017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7951c3d50e9c2fc9b77a6ab3b9650e2ba8ad3b9be14711ad702d031dfdaf1a6"} Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.147390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4f2b26ed9c34142abbc43fe4b5267d89f6b3cc5c80c0c8a01c98d53bcc7e39a"} Feb 20 09:55:10 crc kubenswrapper[4962]: W0220 09:55:10.164659 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.164737 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Feb 20 09:55:10 crc kubenswrapper[4962]: W0220 09:55:10.431850 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.432294 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Feb 20 09:55:10 crc kubenswrapper[4962]: W0220 09:55:10.441927 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.442000 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.465270 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="1.6s" Feb 20 
09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.713476 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.715378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.715434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.715449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 09:55:10.715478 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.716095 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc" Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.716638 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895ebd07858256a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:55:09.044925802 +0000 UTC m=+0.627397728,LastTimestamp:2026-02-20 09:55:09.044925802 +0000 UTC m=+0.627397728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:55:10 crc kubenswrapper[4962]: I0220 
09:55:10.932940 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 09:55:10 crc kubenswrapper[4962]: E0220 09:55:10.934077 4962 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.047461 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.052579 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 10:55:25.573201299 +0000 UTC Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.152531 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" exitCode=0 Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.152661 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04"} Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.152710 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.153820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.153860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.153887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.154663 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631" exitCode=0 Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.154742 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631"} Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.154844 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.155949 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.156487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.156523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.156541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.156767 4962 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26" exitCode=0 Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.156815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26"} Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.156839 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.157204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.157222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.157230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.158135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.158165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.158180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.161469 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.161510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd"}
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.161555 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850"}
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.161652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed"}
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.161684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce"}
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.162571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.162589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.162612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.163664 4962 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f" exitCode=0
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.163694 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f"}
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.163774 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.164738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.164767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:11 crc kubenswrapper[4962]: I0220 09:55:11.164777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.047372 4962 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.052682 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 10:57:36.512200711 +0000 UTC
Feb 20 09:55:12 crc kubenswrapper[4962]: E0220 09:55:12.066451 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="3.2s"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.167433 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59" exitCode=0
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.167527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59"}
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.167625 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.168405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.168432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.168442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.169276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6"}
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.169358 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.170209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.170230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.170238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.171166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9"}
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.171252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2"}
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.174858 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5"}
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.174909 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc"}
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.174930 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.175777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.175804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.175813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.316718 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.318043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.318095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.318109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.318140 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 20 09:55:12 crc kubenswrapper[4962]: E0220 09:55:12.318630 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc"
Feb 20 09:55:12 crc kubenswrapper[4962]: W0220 09:55:12.430687 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Feb 20 09:55:12 crc kubenswrapper[4962]: E0220 09:55:12.430774 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.457707 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:12 crc kubenswrapper[4962]: I0220 09:55:12.465676 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:12 crc kubenswrapper[4962]: W0220 09:55:12.892584 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Feb 20 09:55:12 crc kubenswrapper[4962]: E0220 09:55:12.892698 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.053037 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 09:33:27.032141346 +0000 UTC
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.183842 4962 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248" exitCode=0
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.183918 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248"}
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.184015 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.185318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.185374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.185393 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.187307 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6"}
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.187499 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.188199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.188343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.188494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.191749 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.191943 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed"}
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.192021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c"}
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.191895 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.192045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01"}
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.192044 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.193398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.193446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.193466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.193604 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.193685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.193740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.194255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.194335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.194388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:13 crc kubenswrapper[4962]: I0220 09:55:13.365702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.053813 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 17:27:09.512294208 +0000 UTC
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce"}
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199391 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910"}
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b"}
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199417 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc"}
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199452 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199485 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199495 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199700 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199941 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.199973 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.200686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.200709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.200717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.201536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.201556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.201564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.201620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.201646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.201659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.550473 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:14 crc kubenswrapper[4962]: I0220 09:55:14.947436 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.054874 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:59:45.566574018 +0000 UTC
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.211057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea"}
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.211116 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.211499 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.211212 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.211321 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.211228 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.213949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.214238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.214891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.215122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.519819 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.521577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.521689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.521707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:15 crc kubenswrapper[4962]: I0220 09:55:15.521752 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.055026 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:54:37.272119126 +0000 UTC
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.213245 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.213487 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.213366 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.214613 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.214649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.214661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.215066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.215167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.215255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.465978 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 20 09:55:16 crc kubenswrapper[4962]: I0220 09:55:16.569287 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.056011 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:29:42.362354862 +0000 UTC
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.217266 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.217341 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.218995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.219170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.219397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.219175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.219671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.219695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.341457 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.341737 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.343290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.343377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.343399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.551558 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.551740 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 20 09:55:17 crc kubenswrapper[4962]: I0220 09:55:17.650269 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:18 crc kubenswrapper[4962]: I0220 09:55:18.056335 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 03:03:23.970947894 +0000 UTC
Feb 20 09:55:18 crc kubenswrapper[4962]: I0220 09:55:18.220734 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:18 crc kubenswrapper[4962]: I0220 09:55:18.221699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:18 crc kubenswrapper[4962]: I0220 09:55:18.221733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:18 crc kubenswrapper[4962]: I0220 09:55:18.221743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:19 crc kubenswrapper[4962]: I0220 09:55:19.057234 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 03:55:36.695294464 +0000 UTC
Feb 20 09:55:19 crc kubenswrapper[4962]: E0220 09:55:19.210956 4962 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 20 09:55:20 crc kubenswrapper[4962]: I0220 09:55:20.058219 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 13:27:57.806237277 +0000 UTC
Feb 20 09:55:21 crc kubenswrapper[4962]: I0220 09:55:21.058834 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 07:36:56.116524787 +0000 UTC
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.059053 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:20:31.550706679 +0000 UTC
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.397544 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.397846 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.399493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.399529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.399541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:22 crc kubenswrapper[4962]: W0220 09:55:22.964484 4962 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 20 09:55:22 crc kubenswrapper[4962]: I0220 09:55:22.964679 4962 trace.go:236] Trace[1520764001]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 09:55:12.963) (total time: 10001ms):
Feb 20 09:55:22 crc kubenswrapper[4962]: Trace[1520764001]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (09:55:22.964)
Feb 20 09:55:22 crc kubenswrapper[4962]: Trace[1520764001]: [10.001098173s] [10.001098173s] END
Feb 20 09:55:22 crc kubenswrapper[4962]: E0220 09:55:22.964711 4962 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 20 09:55:23 crc kubenswrapper[4962]: I0220 09:55:23.012480 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 20 09:55:23 crc kubenswrapper[4962]: I0220 09:55:23.012544 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 20 09:55:23 crc kubenswrapper[4962]: I0220 09:55:23.031400 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 20 09:55:23 crc kubenswrapper[4962]: I0220 09:55:23.031473 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 20 09:55:23 crc kubenswrapper[4962]: I0220 09:55:23.059389 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:30:39.326401687 +0000 UTC
Feb 20 09:55:24 crc kubenswrapper[4962]: I0220 09:55:24.060384 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 05:16:51.963418519 +0000 UTC
Feb 20 09:55:25 crc kubenswrapper[4962]: I0220 09:55:25.061058 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 17:57:30.389532991 +0000 UTC
Feb 20 09:55:26 crc kubenswrapper[4962]: I0220 09:55:26.061508 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 05:03:03.621450828 +0000 UTC
Feb 20 09:55:26 crc kubenswrapper[4962]: I0220 09:55:26.576388 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 20 09:55:26 crc kubenswrapper[4962]: I0220 09:55:26.576589 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:26 crc kubenswrapper[4962]: I0220 09:55:26.578912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:26 crc kubenswrapper[4962]: I0220 09:55:26.578995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:26 crc kubenswrapper[4962]: I0220 09:55:26.579019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.062244 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:30:14.314693553 +0000 UTC
Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.348444 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.348718 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.350546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.350648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.350672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientPID" Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.354804 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.551102 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 09:55:27 crc kubenswrapper[4962]: I0220 09:55:27.551555 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.016697 4962 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.031207 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.035881 4962 trace.go:236] Trace[1064268399]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 09:55:16.918) (total time: 11117ms): Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[1064268399]: ---"Objects listed" error: 11117ms (09:55:28.035) Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[1064268399]: [11.117793162s] 
[11.117793162s] END Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.035928 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.036329 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.036398 4962 csr.go:261] certificate signing request csr-s2kkh is approved, waiting to be issued Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042015 4962 trace.go:236] Trace[871747541]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 09:55:13.577) (total time: 14464ms): Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[871747541]: ---"Objects listed" error: 14464ms (09:55:28.041) Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[871747541]: [14.464075109s] [14.464075109s] END Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042078 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042712 4962 apiserver.go:52] "Watching apiserver" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042718 4962 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.045575 4962 trace.go:236] Trace[810486363]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 09:55:17.833) (total time: 10211ms): Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[810486363]: ---"Objects listed" error: 10209ms (09:55:28.043) Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[810486363]: [10.211979398s] [10.211979398s] END Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.045616 4962 reflector.go:368] Caches populated for *v1.Service from 
k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.047533 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.047997 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.048659 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.048767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049054 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.049177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049192 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.049314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049661 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049717 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.049744 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049820 4962 csr.go:257] certificate signing request csr-s2kkh is issued Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.052726 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.052900 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053353 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053480 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053514 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053927 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054405 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054917 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054997 4962 
desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.062913 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:59:03.33614956 +0000 UTC Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084498 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37862->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084557 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37862->192.168.126.11:17697: read: connection reset by peer" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084555 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37868->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084647 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37868->192.168.126.11:17697: read: connection reset by peer" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 
09:55:28.092585 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.105044 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.129101 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143654 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143749 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143783 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143825 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143854 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143888 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143943 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143994 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") 
pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144051 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144082 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144113 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144263 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144293 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.144445 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144474 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144535 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144654 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144683 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144779 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.144808 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144872 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144907 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144937 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144109 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144278 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145030 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145105 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145136 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145164 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145223 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145239 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145253 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145417 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145452 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145822 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146054 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146295 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145463 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146701 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146756 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146780 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146828 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146850 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146871 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146963 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146987 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146985 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147013 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147089 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147111 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147128 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147158 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147187 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147205 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147224 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147240 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147273 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147337 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147382 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147398 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147413 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147428 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147443 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147460 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147496 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147511 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147541 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147604 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147638 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147652 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147705 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147721 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.147736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147773 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147788 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147834 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147866 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147895 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147910 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147987 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148021 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148052 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148085 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148149 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148182 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148288 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148305 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148388 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148454 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 
09:55:28.148470 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148486 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148521 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148537 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148554 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149144 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149164 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149179 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 
09:55:28.149217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149233 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149269 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149318 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149334 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149542 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149581 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149613 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149631 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149697 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149715 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149752 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149835 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149869 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149885 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149937 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150068 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150150 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.150284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150360 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150430 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150442 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150453 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150463 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150473 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150483 4962 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150493 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150502 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150513 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150524 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150536 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150549 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150562 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150575 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150605 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150616 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150626 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150637 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150646 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150656 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node 
\"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150668 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150677 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154245 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.156942 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157745 4962 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.163420 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147424 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147639 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148332 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148405 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148416 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148505 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148540 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148708 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148762 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150275 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151108 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151164 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151247 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152087 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152435 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152784 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153083 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153889 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154440 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154671 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154718 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154766 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154885 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155000 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155041 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155687 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.156434 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.156713 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.156723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.156808 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157447 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157483 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157552 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157550 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158487 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158585 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158649 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159148 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159453 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.160101 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.160569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.160696 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162322 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162516 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162544 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162560 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.163021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.165851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.166754 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.167323 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.167338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.167795 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.168051 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.168279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.168806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.171946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.171994 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172082 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172282 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172675 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.173897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.174873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175739 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176199 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176755 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176993 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177001 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176977 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177071 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.178360 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.665669704 +0000 UTC m=+20.248141560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.178539 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.678393878 +0000 UTC m=+20.260865724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178935 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.179031 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.179114 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.679097101 +0000 UTC m=+20.261568947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.179577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.179884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180094 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180197 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180377 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180410 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180562 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180458 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180934 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.181358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183663 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183848 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183877 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183980 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.683958535 +0000 UTC m=+20.266430381 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.184253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.184750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185174 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185392 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.187765 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.189691 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.187833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.189728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.190891 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.191307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192441 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192478 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192495 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192563 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.692541388 +0000 UTC m=+20.275013454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.193245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.193372 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194066 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194417 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194555 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194914 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.195365 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.196045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.196226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.197063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.197645 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.197774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.199745 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.199857 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.200262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.200757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201374 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201871 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.202666 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.202927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203950 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205293 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205579 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205783 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.208586 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.219050 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.224837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.238455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.238945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251240 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251295 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251325 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251335 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251346 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251355 4962 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251367 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251400 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251418 4962 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251427 4962 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251436 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251446 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251455 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251482 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251501 4962 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251510 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251520 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251529 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251553 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251563 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251572 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251582 4962 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251774 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251914 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251928 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251937 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252116 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252139 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252151 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252162 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252189 4962 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252183 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252199 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252300 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252314 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252340 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252352 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252364 4962 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252378 4962 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252390 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252401 4962 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252416 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252427 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252438 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252451 4962 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252462 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252473 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252485 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252496 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252509 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252520 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252531 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252542 4962 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252553 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252565 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252577 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252609 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252621 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252631 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252643 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252655 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252673 4962 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252685 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252697 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252709 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252720 4962 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252732 4962 
reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252743 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252754 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252764 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252775 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252786 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252797 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252808 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252817 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252826 4962 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252835 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252844 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252855 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252865 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252873 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252881 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252889 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252896 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252904 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252913 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252921 4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252928 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 
09:55:28.252936 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252944 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252952 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252960 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252968 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252979 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252987 4962 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252995 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253004 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253013 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253023 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253031 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253039 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253049 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253058 4962 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node 
\"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253066 4962 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253074 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253082 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253090 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253098 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253107 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253115 4962 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253123 4962 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253131 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253139 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253148 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253158 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253170 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253180 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253193 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" 
DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253204 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253214 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253224 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253235 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253246 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253258 4962 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253267 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.253277 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253290 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253301 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253313 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253325 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253336 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253348 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253357 4962 
reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253366 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253374 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253384 4962 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254324 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254342 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254361 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254372 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254397 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254408 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254417 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254425 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254433 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254441 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254449 4962 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.254456 4962 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254465 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254473 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254481 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254489 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254497 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254509 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254517 4962 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254525 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254534 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254542 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254551 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254559 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254568 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254577 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254585 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254615 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254625 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254632 4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254641 4962 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254649 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254657 4962 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" 
DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254665 4962 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254672 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254680 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254690 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254702 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254711 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.267338 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed" exitCode=255 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.267397 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed"} Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.280069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.291536 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.302433 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.312487 4962 scope.go:117] "RemoveContainer" containerID="e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.312859 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.315469 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.331098 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.347497 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.369890 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.382101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.384135 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7 WatchSource:0}: Error finding container e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7: Status 404 returned error can't find the container with id e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.391633 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.409794 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7 WatchSource:0}: Error finding container 5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7: Status 404 returned error can't find the container with id 5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7 Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.411692 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7 WatchSource:0}: Error finding container 959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7: Status 404 returned error can't find the container with id 959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.523922 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.758920 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.758985 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.759007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.759030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.759048 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759158 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759173 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc 
kubenswrapper[4962]: E0220 09:55:28.759184 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759223 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759210381 +0000 UTC m=+21.341682227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759267 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759262453 +0000 UTC m=+21.341734299 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759295 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759322 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759317514 +0000 UTC m=+21.341789360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759358 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759376 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-20 09:55:29.759370736 +0000 UTC m=+21.341842582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759410 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759419 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759426 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759443 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759437948 +0000 UTC m=+21.341909784 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.883918 4962 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884107 4962 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884133 4962 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884154 4962 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884169 4962 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc 
kubenswrapper[4962]: W0220 09:55:28.884184 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884199 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.884248 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.103:38906->38.102.83.103:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1895ebd09c196677 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:55:09.644793463 +0000 UTC m=+1.227265349,LastTimestamp:2026-02-20 09:55:09.644793463 +0000 UTC m=+1.227265349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884356 4962 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: 
object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884377 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884394 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884409 4962 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884424 4962 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884439 4962 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884959 4962 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: 
object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.051798 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-20 09:50:28 +0000 UTC, rotation deadline is 2026-12-04 18:42:22.334820915 +0000 UTC Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.052098 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6896h46m53.282727943s for next certificate rotation Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.063035 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:44:32.954128211 +0000 UTC Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.143625 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.145053 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.147557 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.149014 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.151259 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.152484 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.153912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.155764 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.156301 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.157323 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.159415 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 
09:55:29.160279 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.161019 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.161539 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.162087 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.163888 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.164811 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.170817 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.171437 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 
09:55:29.172923 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.173884 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.174534 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.175955 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.176552 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.179996 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.180964 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.183282 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 
09:55:29.184865 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.186852 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.188041 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.193462 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.194470 4962 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.194708 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.199553 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.200297 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" 
path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.200543 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.200929 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.203576 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.204838 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.206132 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.207002 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.208405 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.209066 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.210287 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.211088 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.212090 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.212586 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.213780 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.214324 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.215477 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.215987 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.217011 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.217642 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.218137 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.218695 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.219530 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.222810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.244466 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature 
gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-
20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.272370 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.273334 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.274238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.274946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.276256 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.276278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.276293 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.277456 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.278775 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.278818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 
09:55:29.299361 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.320693 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.340499 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.356785 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.374020 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.389156 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.401817 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.413840 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.430243 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.553501 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-s8xxr"] Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.553789 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.555216 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.555828 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.556821 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.579986 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.597438 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.620934 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.647890 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.665737 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.671304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a431054f-57c5-41b7-93b2-2d2fbf9949ce-hosts-file\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 
09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.671340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9fz6\" (UniqueName: \"kubernetes.io/projected/a431054f-57c5-41b7-93b2-2d2fbf9949ce-kube-api-access-p9fz6\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.703171 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.726476 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.742430 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.771871 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.771965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772073 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a431054f-57c5-41b7-93b2-2d2fbf9949ce-hosts-file\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9fz6\" (UniqueName: \"kubernetes.io/projected/a431054f-57c5-41b7-93b2-2d2fbf9949ce-kube-api-access-p9fz6\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772140 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772268 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772322 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772306761 +0000 UTC m=+23.354778607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772384 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:55:31.772376603 +0000 UTC m=+23.354848449 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772453 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772471 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772483 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772508 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772501117 +0000 UTC m=+23.354972963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772563 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772573 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772581 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772630 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.77262186 +0000 UTC m=+23.355093706 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772679 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a431054f-57c5-41b7-93b2-2d2fbf9949ce-hosts-file\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772869 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.773006 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772980762 +0000 UTC m=+23.355452608 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.789642 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9fz6\" (UniqueName: \"kubernetes.io/projected/a431054f-57c5-41b7-93b2-2d2fbf9949ce-kube-api-access-p9fz6\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.800583 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.803058 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.864514 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: W0220 09:55:29.945058 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda431054f_57c5_41b7_93b2_2d2fbf9949ce.slice/crio-5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a WatchSource:0}: Error finding container 5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a: Status 404 returned error can't find the container with id 5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.972030 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.049982 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.064312 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 20:02:54.255647214 +0000 UTC Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.099921 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.138713 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.138827 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:30 crc kubenswrapper[4962]: E0220 09:55:30.138932 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.138852 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:30 crc kubenswrapper[4962]: E0220 09:55:30.139098 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:30 crc kubenswrapper[4962]: E0220 09:55:30.139275 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.213447 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.253494 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.283330 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8xxr" event={"ID":"a431054f-57c5-41b7-93b2-2d2fbf9949ce","Type":"ContainerStarted","Data":"15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf"} Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.283395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8xxr" event={"ID":"a431054f-57c5-41b7-93b2-2d2fbf9949ce","Type":"ContainerStarted","Data":"5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a"} Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.295179 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.299611 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.315388 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.366916 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.423066 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.432522 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.448507 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wqwgj"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.449004 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.451820 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.452730 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.452935 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.458955 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.459531 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.459718 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7hj8w"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.459820 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.460283 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.462158 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.462757 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.472383 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.472447 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.472929 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.473038 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-m9d46"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.474241 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.484795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.484865 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.485513 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.485714 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.485928 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486064 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486190 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486313 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486471 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.488862 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 09:55:30 crc kubenswrapper[4962]: 
I0220 09:55:30.493186 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.510953 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.527438 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.536860 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.550651 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.568640 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578392 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-bin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-os-release\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66n7s\" (UniqueName: \"kubernetes.io/projected/ef72d73c-d177-4436-b681-83866e1f6d12-kube-api-access-66n7s\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578541 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578580 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-socket-dir-parent\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578624 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-cnibin\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578672 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578729 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-os-release\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578758 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-daemon-config\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-multus-certs\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578934 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578976 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579007 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579033 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579054 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-multus\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/751d5e0b-919c-4777-8475-ed7214f7647f-proxy-tls\") pod \"machine-config-daemon-m9d46\" (UID: 
\"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-system-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-cni-dir\") pod 
\"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579225 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-hostroot\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/751d5e0b-919c-4777-8475-ed7214f7647f-kube-api-access-rzq9p\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579519 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-kubelet\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579543 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cnibin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cni-binary-copy\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580001 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580218 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxzh\" (UniqueName: \"kubernetes.io/projected/1957ac70-30f9-48c2-a82b-72aa3b7a883a-kube-api-access-fwxzh\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580280 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/751d5e0b-919c-4777-8475-ed7214f7647f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580351 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-conf-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-etc-kubernetes\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-k8s-cni-cncf-io\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-netns\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580485 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/751d5e0b-919c-4777-8475-ed7214f7647f-rootfs\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.585737 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.598819 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.611482 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.623030 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.638213 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.652159 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.663838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxzh\" (UniqueName: \"kubernetes.io/projected/1957ac70-30f9-48c2-a82b-72aa3b7a883a-kube-api-access-fwxzh\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/751d5e0b-919c-4777-8475-ed7214f7647f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-etc-kubernetes\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-conf-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc 
kubenswrapper[4962]: I0220 09:55:30.681351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-etc-kubernetes\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-conf-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681437 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-k8s-cni-cncf-io\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-netns\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/751d5e0b-919c-4777-8475-ed7214f7647f-rootfs\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681561 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-k8s-cni-cncf-io\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681630 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681650 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-netns\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681682 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/751d5e0b-919c-4777-8475-ed7214f7647f-rootfs\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681722 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"ovnkube-node-99b2s\" 
(UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-bin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681752 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-os-release\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66n7s\" (UniqueName: \"kubernetes.io/projected/ef72d73c-d177-4436-b681-83866e1f6d12-kube-api-access-66n7s\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"ovnkube-node-99b2s\" (UID: 
\"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681844 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-os-release\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-socket-dir-parent\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-cnibin\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-daemon-config\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681982 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-multus-certs\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682003 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 
09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682106 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-multus\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682148 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/751d5e0b-919c-4777-8475-ed7214f7647f-proxy-tls\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682190 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-system-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682262 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682283 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-hostroot\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682327 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/751d5e0b-919c-4777-8475-ed7214f7647f-kube-api-access-rzq9p\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682373 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cnibin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-kubelet\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682416 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682454 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cni-binary-copy\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-bin\") pod 
\"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/751d5e0b-919c-4777-8475-ed7214f7647f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682791 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-os-release\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " 
pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682832 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682855 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-socket-dir-parent\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682875 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-os-release\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: 
I0220 09:55:30.682883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-multus\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682978 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683029 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-hostroot\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683062 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683108 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-cnibin\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-kubelet\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683138 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"ovnkube-node-99b2s\" (UID: 
\"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cnibin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-multus-certs\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683212 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-system-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683668 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683750 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cni-binary-copy\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.684026 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-daemon-config\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.684258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.687260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.687901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/751d5e0b-919c-4777-8475-ed7214f7647f-proxy-tls\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.699225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66n7s\" (UniqueName: \"kubernetes.io/projected/ef72d73c-d177-4436-b681-83866e1f6d12-kube-api-access-66n7s\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.701742 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.702292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/751d5e0b-919c-4777-8475-ed7214f7647f-kube-api-access-rzq9p\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.708068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxzh\" (UniqueName: \"kubernetes.io/projected/1957ac70-30f9-48c2-a82b-72aa3b7a883a-kube-api-access-fwxzh\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.709981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod 
\"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.714881 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.762117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: W0220 09:55:30.776564 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1957ac70_30f9_48c2_a82b_72aa3b7a883a.slice/crio-00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067 WatchSource:0}: Error finding container 00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067: Status 404 returned error can't find the container with id 00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067 Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.783459 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.796482 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: W0220 09:55:30.803920 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef72d73c_d177_4436_b681_83866e1f6d12.slice/crio-909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9 WatchSource:0}: Error finding container 909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9: Status 404 returned error can't find the container with id 909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9 Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.806450 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: W0220 09:55:30.829133 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2abd2b70_bb78_49a0_b930_cd066384e803.slice/crio-30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7 WatchSource:0}: Error finding container 30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7: Status 404 returned error can't find the container with id 30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7 Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.064786 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 12:07:52.982524225 +0000 UTC Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.288920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.290027 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" exitCode=0 Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.290080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.290097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" 
event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.293197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.293245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.293259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"46cb7bc029282997298cb9150b87b2ce8241d6d6c942b7c31acc89474cb54917"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.295317 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966" exitCode=0 Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.295396 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.295437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" 
event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerStarted","Data":"909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.296939 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.296981 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.307336 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.321057 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.337360 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.355891 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.386936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.406877 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.420354 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.438165 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.450662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.471168 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.483498 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.499395 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.524406 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.539979 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.558906 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d
3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.573241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.585373 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.599039 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.615301 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.630759 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.643943 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.658627 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.677096 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.688918 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789656 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789669 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.789648239 +0000 UTC m=+27.372120095 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789741 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789781 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789823 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789837 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789864 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789877 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789887 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789891 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.789880896 +0000 UTC m=+27.372352752 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789914 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.789905367 +0000 UTC m=+27.372377213 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789939 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.790024 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.79000595 +0000 UTC m=+27.372477796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.790081 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.790113 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-20 09:55:35.790106173 +0000 UTC m=+27.372578089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.065296 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 13:46:41.710907399 +0000 UTC Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.138653 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.138671 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.138837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:32 crc kubenswrapper[4962]: E0220 09:55:32.138886 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:32 crc kubenswrapper[4962]: E0220 09:55:32.138983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:32 crc kubenswrapper[4962]: E0220 09:55:32.139051 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" 
event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.308034 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5" exitCode=0 Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.308141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.329887 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.355751 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.372670 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.390539 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.412366 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.430300 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.432053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.448392 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.448452 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.448702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.466358 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.523203 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.563846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.585569 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.603510 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.625494 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\
\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.646375 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.663287 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.677462 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.693268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.705936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.706084 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hxb97"] Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.706814 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.708669 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.709109 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.709202 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.709383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.720559 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.734309 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.746011 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.755861 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.768585 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.784723 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.795557 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.811633 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.828215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.842391 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.866326 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.887896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.902499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f0e53ce-e004-473e-be85-ef4c83e399c7-host\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.902559 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f0e53ce-e004-473e-be85-ef4c83e399c7-serviceca\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.902610 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27br\" (UniqueName: \"kubernetes.io/projected/4f0e53ce-e004-473e-be85-ef4c83e399c7-kube-api-access-c27br\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.903092 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.923571 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.937656 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.955845 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.967040 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.977372 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.994069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004118 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27br\" (UniqueName: \"kubernetes.io/projected/4f0e53ce-e004-473e-be85-ef4c83e399c7-kube-api-access-c27br\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004199 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/4f0e53ce-e004-473e-be85-ef4c83e399c7-host\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f0e53ce-e004-473e-be85-ef4c83e399c7-serviceca\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f0e53ce-e004-473e-be85-ef4c83e399c7-host\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.005341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f0e53ce-e004-473e-be85-ef4c83e399c7-serviceca\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.009136 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.024068 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.024214 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27br\" (UniqueName: \"kubernetes.io/projected/4f0e53ce-e004-473e-be85-ef4c83e399c7-kube-api-access-c27br\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.066285 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 19:32:12.387745276 +0000 UTC Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.315563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.315673 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.318224 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f" exitCode=0 Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.318300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f"} Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.320024 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.335768 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.351842 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.361464 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.377729 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.399371 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.414195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.429352 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.450960 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.474355 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.491998 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.504524 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.516887 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.529735 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.543962 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.595889 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.067183 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 21:14:40.57364124 +0000 UTC Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.137922 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.138080 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.138144 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.138290 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.138378 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.138619 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.323067 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211" exitCode=0 Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.323167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.324149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hxb97" event={"ID":"4f0e53ce-e004-473e-be85-ef4c83e399c7","Type":"ContainerStarted","Data":"e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.324193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hxb97" event={"ID":"4f0e53ce-e004-473e-be85-ef4c83e399c7","Type":"ContainerStarted","Data":"9c3e86a25b4a96a9c1728f4ceba5d23082e5c15f8fc65a3b8b77a765b4d7d893"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.344739 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.365007 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.378665 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.389209 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.401379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.414138 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.432500 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.436688 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.441953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.441995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.442007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.442345 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.448938 4962 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.449267 4962 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc 
kubenswrapper[4962]: I0220 09:55:34.451849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451899 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.452213 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.465222 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.470867 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.477125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.489202 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.494027 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499582 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.505294 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.516391 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521327 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.523533 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.542050 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.543112 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541a
cda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.545957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.545984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.545992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.546007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.546019 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.554946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.558215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.558352 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.558495 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 
09:55:34.560843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.566293 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.573399 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a223
56c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.584687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.604573 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.661146 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663235 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.677049 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.694193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.711181 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.732557 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.747875 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc 
kubenswrapper[4962]: I0220 09:55:34.765670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.771363 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.783958 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.797787 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.811039 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.825691 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.840004 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.856424 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868521 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.872048 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.886443 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.908129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.937321 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.960319 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971704 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.975736 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.987177 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.996509 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.006530 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.020248 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.038207 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.054271 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.067522 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 22:58:19.125569499 +0000 UTC Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: 
I0220 09:55:35.075839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075855 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179476 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284986 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.335833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.339786 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84" exitCode=0 Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.339880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84"} Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.349416 4962 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.366994 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388387 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.392178 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:
55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.425527 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.466124 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.483504 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491428 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.505241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.527240 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.543650 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.562209 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.577627 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.594503 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595318 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.613860 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.629895 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.652268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.668284 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc 
kubenswrapper[4962]: I0220 09:55:35.697536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697549 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799683 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799723 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.873767 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.873923 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.873897628 +0000 UTC m=+35.456369474 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.873973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.874020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.874055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.874088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874148 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874179 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874185 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874223 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:35 crc 
kubenswrapper[4962]: E0220 09:55:35.874235 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874235 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874247 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874249 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874239 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874211818 +0000 UTC m=+35.456683674 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874324 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874316701 +0000 UTC m=+35.456788547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874334 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874329522 +0000 UTC m=+35.456801368 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874353 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874340182 +0000 UTC m=+35.456812028 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902340 4962 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005101 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.068670 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:56:50.834110646 +0000 UTC Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109637 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.138938 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.139005 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.139018 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:36 crc kubenswrapper[4962]: E0220 09:55:36.139158 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:36 crc kubenswrapper[4962]: E0220 09:55:36.139336 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:36 crc kubenswrapper[4962]: E0220 09:55:36.139528 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.194380 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.212876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.316518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.316849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.316981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.317119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.317732 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.350643 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2" exitCode=0 Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.350784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.370773 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.386564 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.404782 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.419379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.421868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.421985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.421999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.422023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.422036 4962 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.432342 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aa
f09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.450716 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.470010 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.486935 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.501054 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.519671 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524920 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524940 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.541168 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.560547 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.582719 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.598238 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.613494 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627289 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730537 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835707 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939459 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.068863 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 17:00:57.535686837 +0000 UTC Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352274 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.360530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.360833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.360887 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.367904 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerStarted","Data":"dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.386681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.401438 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.406949 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.413356 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.437987 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.454274 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455571 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.472176 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.487914 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.510681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.535966 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.552838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558372 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.568311 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.586570 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.600340 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.622741 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a223
56c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.638280 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.657522 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661404 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.680519 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.712025 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.740650 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765207 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.793616 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.816475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.830614 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.843871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.858207 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867878 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867890 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.872278 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.883781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.895469 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.908797 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.922409 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.944682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.955904 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: 
I0220 09:55:37.972734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972826 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.069677 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 23:36:11.659867331 +0000 UTC Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.075989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076096 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.138512 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:38 crc kubenswrapper[4962]: E0220 09:55:38.138842 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.139714 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:38 crc kubenswrapper[4962]: E0220 09:55:38.139863 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.139963 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:38 crc kubenswrapper[4962]: E0220 09:55:38.140070 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180284 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299247 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.380100 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402344 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608255 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711249 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813831 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917155 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020532 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.070208 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 07:18:38.889332968 +0000 UTC Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123713 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.160534 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.190907 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.219212 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.226967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227304 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.240097 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 
09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.262015 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.284183 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.306273 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.326571 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.330815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.330884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.330906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.331104 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.331138 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.343440 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.357519 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.373113 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.383264 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.396272 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.416536 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 
09:55:39.434555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434567 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.445845 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf3793
95f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.468169 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc 
kubenswrapper[4962]: I0220 09:55:39.537753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537765 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640934 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640946 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743773 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846368 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950106 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.012331 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053245 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.071157 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:43:56.344870697 +0000 UTC Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.138227 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.138303 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.138240 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:40 crc kubenswrapper[4962]: E0220 09:55:40.138408 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:40 crc kubenswrapper[4962]: E0220 09:55:40.138733 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:40 crc kubenswrapper[4962]: E0220 09:55:40.138585 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155769 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259438 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363216 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467379 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571470 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674647 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674795 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779482 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882819 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986986 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.071653 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:26:48.70432122 +0000 UTC Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091126 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091172 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.108486 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.136842 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.163184 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.185439 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193709 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.203425 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.225033 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc 
kubenswrapper[4962]: I0220 09:55:41.247321 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc 
kubenswrapper[4962]: I0220 09:55:41.280790 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.296958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297099 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.307428 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.324959 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.338696 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.362408 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.385806 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.397035 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/0.log" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399644 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.401657 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f" exitCode=1 Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.401699 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.402511 4962 scope.go:117] "RemoveContainer" containerID="1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.419584 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.442303 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: 
I0220 09:55:41.461113 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.485204 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc 
kubenswrapper[4962]: I0220 09:55:41.503796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503867 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.511745 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09
:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca18395
49c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.542467 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.561470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.577071 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.593128 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.605584 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606305 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.616423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.627865 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.637250 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.647195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.660259 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.671823 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.684946 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.699677 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: 
I0220 09:55:41.709802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709889 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.812903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.812973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.812994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.813022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.813042 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916280 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.072891 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 22:38:13.928021397 +0000 UTC Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.121985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.138691 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.138762 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:42 crc kubenswrapper[4962]: E0220 09:55:42.138807 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.138691 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:42 crc kubenswrapper[4962]: E0220 09:55:42.139003 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:42 crc kubenswrapper[4962]: E0220 09:55:42.138922 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328184 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.409613 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/0.log" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.412633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.412799 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.429379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b
154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is 
after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431859 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431885 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.445472 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.468905 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.489972 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.504546 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.518071 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.531909 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535239 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535282 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.553151 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.570087 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.583101 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.599079 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.611863 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.624160 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 
09:55:42.637764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637775 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.644236 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf3793
95f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.655981 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc 
kubenswrapper[4962]: I0220 09:55:42.740720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740730 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.829443 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf"] Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.830101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.832756 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.833675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5m2\" (UniqueName: \"kubernetes.io/projected/8526746c-450b-4df8-8ea1-f0cbabd13894-kube-api-access-tx5m2\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843729 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8526746c-450b-4df8-8ea1-f0cbabd13894-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.844173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-env-overrides\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.853577 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.870437 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.895561 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.932707 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5m2\" (UniqueName: \"kubernetes.io/projected/8526746c-450b-4df8-8ea1-f0cbabd13894-kube-api-access-tx5m2\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8526746c-450b-4df8-8ea1-f0cbabd13894-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 
09:55:42.945229 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-env-overrides\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.946153 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-env-overrides\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.946329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947259 4962 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.953261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8526746c-450b-4df8-8ea1-f0cbabd13894-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.956340 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\"
,\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.967761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5m2\" (UniqueName: \"kubernetes.io/projected/8526746c-450b-4df8-8ea1-f0cbabd13894-kube-api-access-tx5m2\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.970554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.988437 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.008344 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.022390 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.039969 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.052190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.059763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.073116 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 05:28:54.14387113 +0000 UTC Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.077407 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.098091 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.120019 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.142070 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.147290 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155575 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.161685 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: W0220 09:55:43.168710 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8526746c_450b_4df8_8ea1_f0cbabd13894.slice/crio-1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae WatchSource:0}: Error finding container 1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae: Status 404 returned error can't find the container with id 1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258776 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.419166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" event={"ID":"8526746c-450b-4df8-8ea1-f0cbabd13894","Type":"ContainerStarted","Data":"1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.422133 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.423171 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/0.log" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.428483 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" exitCode=1 Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.428543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.428648 4962 scope.go:117] "RemoveContainer" containerID="1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.429541 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.429768 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.452124 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.465935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466075 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.468785 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.488145 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc 
kubenswrapper[4962]: I0220 09:55:43.505789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc 
kubenswrapper[4962]: I0220 09:55:43.525663 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.550967 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.569003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.573652 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.595666 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.618012 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.641407 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.660301 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.674066 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.676195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.704506 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f2
8bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.722269 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.744980 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.767841 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.776890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.776954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.776973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 
09:55:43.777005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.777027 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880134 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.948742 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5bwk2"] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.949286 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.949366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.956523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.956812 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.956764008 +0000 UTC m=+51.539235894 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.956911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957031 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjn55\" (UniqueName: \"kubernetes.io/projected/d590527b-ed56-4fb4-a712-b09781618a76-kube-api-access-jjn55\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957082 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957196 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.95716031 +0000 UTC m=+51.539632186 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957086 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957311 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957529 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957539 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-20 09:55:59.957497831 +0000 UTC m=+51.539969867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957557 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957583 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957735 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.957714468 +0000 UTC m=+51.540186324 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958011 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958037 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958053 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958151 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-20 09:55:59.95809943 +0000 UTC m=+51.540571296 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.964319 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3
5512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983847 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.985419 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.019985 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.038501 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.058497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.058583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjn55\" (UniqueName: \"kubernetes.io/projected/d590527b-ed56-4fb4-a712-b09781618a76-kube-api-access-jjn55\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.058745 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 
09:55:44.058851 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:44.558824749 +0000 UTC m=+36.141296595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.064571 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf
379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.074059 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 05:26:37.704482081 +0000 UTC Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.077773 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjn55\" (UniqueName: \"kubernetes.io/projected/d590527b-ed56-4fb4-a712-b09781618a76-kube-api-access-jjn55\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.084562 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc 
kubenswrapper[4962]: I0220 09:55:44.086621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086638 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.104393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.117046 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc 
kubenswrapper[4962]: I0220 09:55:44.133763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9
db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] 
Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.138361 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.138556 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.138397 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.138699 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.138373 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.138780 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.150230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.164687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.177629 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189122 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.193922 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.207311 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.226065 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.246936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.259241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292997 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396249 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.436992 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.444877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" event={"ID":"8526746c-450b-4df8-8ea1-f0cbabd13894","Type":"ContainerStarted","Data":"84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.444960 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" event={"ID":"8526746c-450b-4df8-8ea1-f0cbabd13894","Type":"ContainerStarted","Data":"c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.469671 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.492988 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498827 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.511886 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.534181 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.552657 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.563750 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.564060 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.564241 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:45.564204846 +0000 UTC m=+37.146676732 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.579878 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587309 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.606849 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.607446 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611883 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.625378 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.632861 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637771 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.641653 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.661254 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.671434 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676964 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.692422 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.705584 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.719446 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.719632 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721969 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.729054 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.748803 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\"
,\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.762442 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842
f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.775129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.793871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.806366 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824450 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030609 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.074590 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 16:17:30.384275256 +0000 UTC Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135250 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238640 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341678 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.446378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.446510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.446530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.447201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.447343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.551725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552718 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.573652 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:45 crc kubenswrapper[4962]: E0220 09:55:45.573900 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:45 crc kubenswrapper[4962]: E0220 09:55:45.573976 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:47.573953518 +0000 UTC m=+39.156425394 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657309 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760778 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864754 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069639 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.075703 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:41:32.421434868 +0000 UTC Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137815 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137834 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138569 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138390 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138200 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138883 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178744 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282433 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.387002 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.490543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.490919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.491122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.491288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.491481 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595410 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595480 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699312 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.802864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.802959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.802983 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.803026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.803052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907328 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011797 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.076118 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:52:26.030689126 +0000 UTC Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219409 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322483 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.404556 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.406011 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:55:47 crc kubenswrapper[4962]: E0220 09:55:47.406299 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425731 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc 
kubenswrapper[4962]: I0220 09:55:47.425929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425950 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.439712 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.458397 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.481363 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.503641 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529074 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.545485 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.563433 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc 
kubenswrapper[4962]: I0220 09:55:47.600003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.600259 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:47 crc kubenswrapper[4962]: E0220 09:55:47.600464 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:47 crc kubenswrapper[4962]: E0220 09:55:47.600566 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:51.6005354 +0000 UTC m=+43.183007276 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.624895 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting 
controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633140 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633206 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.650397 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.674333 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.694215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.715436 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736404 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736423 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.740684 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.771952 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.784975 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842
f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.840903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.840967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.840985 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.841037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.841058 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943929 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046933 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.077165 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:40:54.047399832 +0000 UTC Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.138791 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.138828 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139036 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.139078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.139385 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139370 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139636 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149960 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.252662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.252977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.253136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.253352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.253496 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357661 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460581 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564213 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770936 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874264 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977406 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.077923 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:42:31.265147247 +0000 UTC Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080355 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.164443 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183428 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.186873 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.206649 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.224509 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.241343 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.259938 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc 
kubenswrapper[4962]: I0220 09:55:49.286065 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.292373 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.328376 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.346215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.361802 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.374703 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389257 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.390152 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.402189 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.415856 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.428341 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: 
I0220 09:55:49.440389 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.454315 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc 
kubenswrapper[4962]: I0220 09:55:49.491578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491705 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595175 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.802997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803164 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.906871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.906940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.906960 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.907000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.907023 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011511 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.078218 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 06:31:45.69149639 +0000 UTC Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114413 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138721 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138868 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138781 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139141 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139334 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139485 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218135 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320141 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423342 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527200 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733635 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733680 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.836899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.836997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.837018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.837047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.837067 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940816 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.079534 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 18:13:35.979005054 +0000 UTC Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.147938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148429 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.355004 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.459899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.459968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.459986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.460016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.460035 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564340 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.649111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:51 crc kubenswrapper[4962]: E0220 09:55:51.649371 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:51 crc kubenswrapper[4962]: E0220 09:55:51.649522 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.649488121 +0000 UTC m=+51.231960007 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668326 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771406 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979638 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.080792 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 01:22:06.20025467 +0000 UTC Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.138464 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.138722 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.138757 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.138915 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.139121 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.139180 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.139238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.139301 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186498 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394841 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.497540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498170 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601680 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601728 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808562 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.915927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916812 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020384 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.081488 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 15:23:05.928269323 +0000 UTC Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227417 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227531 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227551 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330568 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433411 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433515 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.536953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537113 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641649 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744830 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.849005 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952877 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057198 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.082469 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 07:04:25.687432287 +0000 UTC Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138201 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138381 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.138405 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.138679 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.138972 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.139105 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160952 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264692 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368790 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.471931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.471975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.471992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.472014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.472027 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574881 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.783744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784575 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888584 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948742 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.973327 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:54Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986321 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.011332 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017279 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.041262 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047870 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.068339 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074648 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.083631 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 09:19:47.94018627 +0000 UTC Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.094396 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",
\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.094662 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097912 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201626 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305907 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409341 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.511906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.511993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.512012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.512042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.512062 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615580 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719161 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823617 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927631 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031800 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.084561 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:55:31.053817176 +0000 UTC Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135669 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135736 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.137887 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.138020 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.137889 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138109 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.138033 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138310 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138469 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138657 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239386 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343835 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448262 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.557938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558182 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662180 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.870002 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077240 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.085261 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 13:00:36.829955292 +0000 UTC Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.180991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181208 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285419 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.388978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389093 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389112 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493194 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596589 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700396 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803171 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.905998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906155 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010698 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.085918 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:44:19.482540438 +0000 UTC Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114550 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.138933 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.138933 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.138998 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.139167 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139263 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139448 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139659 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139808 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219677 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324210 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427507 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530668 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634711 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738510 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945192 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048352 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.086638 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 16:58:53.337580805 +0000 UTC Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152530 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.161195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.180490 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.207498 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.234230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.254945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 
09:55:59.255092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.258679 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf3793
95f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.278647 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.300070 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.315476 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.338925 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358438 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.361903 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.376796 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.407098 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.445186 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461820 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.473788 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.497328 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.523190 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.538283 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565215 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.652695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.652911 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.653016 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:15.652987722 +0000 UTC m=+67.235459598 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773382 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.817103 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.831288 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc 
kubenswrapper[4962]: I0220 09:55:59.831367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.862012 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.875993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876100 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.897318 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.920857 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\"
,\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.939997 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.957916 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.958138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958241 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958260 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:56:31.958210134 +0000 UTC m=+83.540682010 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958328 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:31.958303167 +0000 UTC m=+83.540775043 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.958429 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.958514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958737 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958799 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958856 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958873 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958915 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:31.958870665 +0000 UTC m=+83.541342571 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958954 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:31.958938817 +0000 UTC m=+83.541410923 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.961457 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 
09:55:59.980278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.984742 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.004677 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.024046 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.041938 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5
e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.057789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.059775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060102 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060171 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060200 4962 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060309 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:32.060275955 +0000 UTC m=+83.642747841 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.070672 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.082582 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083637 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083684 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083734 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.087250 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 16:04:06.111427565 +0000 UTC Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.108400 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:
55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.128766 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.138525 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.138705 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.138816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.138978 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.139041 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.139117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.139207 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.139360 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.140584 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.153131 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni
/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.173038 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc 
kubenswrapper[4962]: I0220 09:56:00.186960 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186984 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.290385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.290537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.290675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.291000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.291165 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.521388 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.525842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.527002 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.553233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.575550 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc 
kubenswrapper[4962]: I0220 09:56:00.599185 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 
09:56:00.605383 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.622936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.672155 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.702886 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709223 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.724426 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.741364 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.757265 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.771653 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.785661 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.803166 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812304 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.823741 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.837003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e3
8fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.850917 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.867470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.883741 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.896579 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: 
I0220 09:56:00.914947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915051 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019256 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.088259 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:40:06.042929018 +0000 UTC Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122160 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224757 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433674 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.534670 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.535834 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537179 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.541462 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" exitCode=1 Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.541536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.541652 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.542919 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:01 crc kubenswrapper[4962]: E0220 09:56:01.543252 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.568218 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.592544 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.616403 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.637662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: 
I0220 09:56:01.640243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.641007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.659052 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d27255
9ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.677681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc 
kubenswrapper[4962]: I0220 09:56:01.719914 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.744886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.744968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.744997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.745034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.745059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.749393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.778922 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.800995 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.819719 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.839517 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.848905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.848979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.849003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.849031 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.849052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.862214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.896631 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin
-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"n
ame\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc 
kubenswrapper[4962]: I0220 09:56:01.916513 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.938794 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952565 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.958256 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.975839 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e3
8fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057220 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.089261 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 04:24:20.884442287 +0000 UTC Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138372 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138375 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138529 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.138542 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.139060 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.139102 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.139215 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265206 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369243 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.548440 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.554153 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.554451 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.574998 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.597911 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc 
kubenswrapper[4962]: I0220 09:56:02.634560 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.664830 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681217 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.685125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.706819 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.726318 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.745353 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.763907 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc 
kubenswrapper[4962]: I0220 09:56:02.784283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784354 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.791868 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.807031 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842
f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.822496 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.836093 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.848507 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.867233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.884236 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.892890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 
09:56:02.893089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893120 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.908044 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf3793
95f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.925846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc 
kubenswrapper[4962]: I0220 09:56:02.996980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996997 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.090169 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 12:24:29.16089228 +0000 UTC Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100558 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204196 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308637 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415667 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.622925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.622985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.623002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.623027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.623046 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726864 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830840 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039639 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039770 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.091018 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 23:18:20.472944933 +0000 UTC Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.138330 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.138359 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.138585 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.138665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.138891 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.139091 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.139304 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.139744 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.142930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143348 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454179 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662121 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766646 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979888 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.091549 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:27:14.20778628 +0000 UTC Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187449 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354764 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.375706 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 2025-08-24T17:21:41Z"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.382890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.382982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.383004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.383032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.383057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.401866 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 2025-08-24T17:21:41Z"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407789 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.427209 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431475 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.450341 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456165 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.473669 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.473826 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476283 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578762 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681817 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785682 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889185 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993572 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.092372 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:52:43.399918635 +0000 UTC Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138025 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138120 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138186 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138318 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138485 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138784 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201673 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305783 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.409667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514150 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617582 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617788 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721746 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824773 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927850 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032195 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.093349 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 23:59:05.065333238 +0000 UTC Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135831 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240435 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240498 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344443 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448784 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552128 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758290 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861616 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965410 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.094530 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:45:16.702331302 +0000 UTC Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138334 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138485 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.138532 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138495 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.138764 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.138884 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.139064 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173168 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282874 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.388720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389752 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493968 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701457 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805665 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909299 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013157 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.095781 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:21:55.159768185 +0000 UTC Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117121 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.160068 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d27255
9ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.177364 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc 
kubenswrapper[4962]: I0220 09:56:09.197667 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.220072 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc 
kubenswrapper[4962]: I0220 09:56:09.221346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221391 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.255245 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.293302 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.311555 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325518 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.328270 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.346382 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.362910 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.378529 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.394481 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.412093 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.427958 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429635 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.444837 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.466272 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.489369 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.506732 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: 
I0220 09:56:09.532980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636791 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740789 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844502 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053263 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.096729 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 20:20:51.506824202 +0000 UTC Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.138486 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.138528 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.138773 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.138907 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.139124 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.139273 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.139565 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.139880 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.156940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157078 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260848 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.364059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467937 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467982 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572318 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675402 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.779933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.882895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.882966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.882981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.883361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.883383 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987755 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091944 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.097256 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:44:22.325489916 +0000 UTC Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195433 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297974 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400718 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607363 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710439 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814730 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021313 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.097907 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:57:41.425293567 +0000 UTC Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124319 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138646 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138713 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138738 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138678 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.138858 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.139029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.139290 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.139322 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227348 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330633 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.433924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.433989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.434007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.434034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.434052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643274 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950110 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052337 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.099046 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 20:31:06.910593073 +0000 UTC Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360834 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463746 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567624 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.876902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.876964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.876982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.877003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.877016 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979601 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979638 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.099574 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:11:27.900707408 +0000 UTC
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138730 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138760 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138694 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.138841 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76"
Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.138928 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.139020 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.139192 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184417 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288182 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391491 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.493953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494009 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494058 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596559 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698761 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.804767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.804917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.804962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.805004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.805030 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908931 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.099817 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 08:29:47.819138235 +0000 UTC
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.139285 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a"
Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.139461 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218658 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321302 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424504 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630349 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.661115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2"
Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.661396 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.661471 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:47.661449782 +0000 UTC m=+99.243921668 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691888 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.705697 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.709985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710102 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.722749 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726968 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.739337 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744630 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.758681 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762350 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.776389 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.776649 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778781 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881630 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985672 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098617 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.100250 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 19:48:30.592522417 +0000 UTC Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.138682 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.138832 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.139035 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.139086 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.139186 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.139234 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.139345 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.139401 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201366 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.303003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.405928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406073 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508364 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610630 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816275 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918868 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021485 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.101243 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:53:44.286619126 +0000 UTC Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124844 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227281 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329972 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432766 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535493 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623276 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/0.log" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623336 4962 generic.go:334] "Generic (PLEG): container finished" podID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" containerID="e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661" exitCode=1 Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623377 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerDied","Data":"e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623937 4962 scope.go:117] "RemoveContainer" containerID="e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.636739 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638909 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.651225 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.665181 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.676805 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: 
I0220 09:56:17.690895 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.700987 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.714351 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.725294 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741342 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.745774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 
6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.766633 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.780629 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.792579 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.806647 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.820864 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.831376 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844426 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844994 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.847193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.859098 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.870765 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948885 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051990 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.101838 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 19:33:47.892218583 +0000 UTC Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138227 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.138666 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138314 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138344 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138315 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.138988 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.139185 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.139305 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154497 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.463924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.463999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.464024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.464059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.464081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567256 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.630028 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/0.log" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.630149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.650983 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.665662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670485 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670495 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.680158 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.698977 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.712051 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da
6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.722404 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc 
kubenswrapper[4962]: I0220 09:56:18.734369 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.752554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.765643 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773219 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.786678 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 
6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.814164 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.830184 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.843993 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.855565 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.867544 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842
f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876188 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876224 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.885437 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.904107 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.916322 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.979955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980043 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.102179 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 00:11:27.504331346 +0000 UTC Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.155835 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.170497 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184829 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.188973 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.204362 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.221357 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.239861 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.255820 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.269621 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.281536 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287422 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.296623 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.311769 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.338762 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.370513 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.387374 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.390647 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.390872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.390966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.391686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.391779 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.403444 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.415694 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.430522 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.442326 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494803 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598237 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702492 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.703032 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805880 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909237 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011785 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.102300 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 23:22:32.85477297 +0000 UTC Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114636 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143212 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143297 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143514 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2"
Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.143503 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143678 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.143756 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76"
Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.144052 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.144315 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217684 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320924 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423472 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423653 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525881 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.627972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628067 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731610 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731645 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834412 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937187 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039183 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.102834 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 09:48:00.542275573 +0000 UTC
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142202 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244987 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347896 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347932 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.450897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.450972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.450991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.451021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.451040 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.760007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863557 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.966973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967118 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070816 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.103859 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:52:05.407747362 +0000 UTC
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138493 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.138704 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138888 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2"
Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.138967 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138971 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.139013 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76"
Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.139087 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275445 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378733 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481823 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687167 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790288 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894207 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000326 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.103987 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:21:56.156205683 +0000 UTC Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104899 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.153046 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207821 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311447 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414204 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516809 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618906 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721772 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721784 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825463 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928781 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.105677 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:51:08.115396288 +0000 UTC Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.134968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135184 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.138362 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.138438 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.138538 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.138682 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.139074 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.139273 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.139337 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.139426 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342739 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446384 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652324 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755435 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069537 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.105916 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 00:21:49.696128383 +0000 UTC Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276708 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483537 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586972 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690369 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.896955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000490 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071456 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.100702 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106190 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 18:10:03.095582296 +0000 UTC Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.123101 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127660 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138511 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138582 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.138715 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138747 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138909 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.138931 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.139621 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.139797 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.140516 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.149644 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155129 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.175773 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183373 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.207478 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.207652 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210370 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314669 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314751 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521684 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.624972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625114 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.661020 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.664894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.665979 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.692522 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.710808 4962 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b0
5c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728586 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728618 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.729758 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.747475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.764815 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: 
I0220 09:56:26.780545 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.794800 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.814907 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 
6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.830935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.830979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.830989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.831006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.831016 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.838891 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.853333 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\"
,\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.868007 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.886490 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.899358 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.919239 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.933951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934058 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.938693 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.952935 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.964104 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.974105 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.983772 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037292 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.107051 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:21:47.202288435 +0000 UTC Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140127 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243694 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347425 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555232 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.671855 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.672921 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.676948 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" exitCode=1 Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.677017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.677089 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.678106 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:27 crc kubenswrapper[4962]: E0220 09:56:27.678381 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.699336 4962 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.720587 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.746963 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762684 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762820 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.780569 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 
6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.815870 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.839801 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.857724 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866633 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.879423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.899214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842
f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.921896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.939586 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.954902 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.968499 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc 
kubenswrapper[4962]: I0220 09:56:27.970087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.985473 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.000054 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.014012 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.037447 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.051710 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da
6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.066415 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc 
kubenswrapper[4962]: I0220 09:56:28.073040 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073133 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.107972 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 18:45:50.811630576 +0000 UTC Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138649 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138813 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138732 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139059 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139258 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139452 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139738 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278925 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.381973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382079 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.484876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.484943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.484967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.485010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.485030 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587877 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.682522 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.686691 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.686944 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690435 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.691113 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.712661 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.737580 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.753557 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: 
I0220 09:56:28.773764 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.788351 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.794280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.794519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.794951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.795166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.795331 4962 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.804912 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.819780 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc 
kubenswrapper[4962]: I0220 09:56:28.839185 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.855544 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.873752 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.889811 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286
e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898774 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898787 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.908006 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.930934 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 
service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.958686 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.974056 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.985090 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.998610 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4
a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002547 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.013192 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.026763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.108817 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:17:01.957953713 +0000 UTC Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.156153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.176475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.193573 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209093 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209805 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.213338 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.235169 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc 
kubenswrapper[4962]: I0220 09:56:29.254838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.287258 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.308018 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312657 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.321343 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.332893 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.345954 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.363367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.374003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.392928 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.414988 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.427556 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.443888 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.457791 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.471535 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb9293
7bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc 
kubenswrapper[4962]: I0220 09:56:29.519374 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621779 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724453 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.828265 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033507 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033618 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.109430 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 21:23:22.05692129 +0000 UTC Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136353 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138688 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138720 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138739 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138720 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.138794 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.138984 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.139146 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.139187 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239358 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.445923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.445985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.446007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.446037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.446059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549873 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653852 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860538 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965245 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.068994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069127 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.110094 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:24:32.788895851 +0000 UTC Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171242 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274666 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377432 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998413 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.045694 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.045837 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:36.045811087 +0000 UTC m=+147.628282943 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.046343 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.046537 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.046818 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.046560 4962 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047280 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.047257391 +0000 UTC m=+147.629729237 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.046685 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047490 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.047478738 +0000 UTC m=+147.629950584 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.046914 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047627 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047641 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047668 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.047660274 +0000 UTC m=+147.630132120 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101741 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.110994 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 08:58:52.429492638 +0000 UTC Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138426 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.138950 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138533 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.139532 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138484 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.139824 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.138942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.148298 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148557 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148629 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148648 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148726 4962 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.148704585 +0000 UTC m=+147.731176451 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.204950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205051 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205092 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307697 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411908 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620777 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.724013 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.827008 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930157 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034217 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.111144 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:44:01.473678965 +0000 UTC Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.137924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138108 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243375 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347942 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452280 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.555949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556060 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659211 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761923 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864438 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967228 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070398 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.111872 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:48:56.618395552 +0000 UTC Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138498 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138652 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138661 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138666 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.138854 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.139076 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.139171 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.139482 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.172911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.172958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.172976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.173015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.173041 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276922 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380747 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380862 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483742 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586837 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792836 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792848 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998975 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102495 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.112772 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 06:17:49.663363892 +0000 UTC Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205222 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.308573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309248 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412785 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725522 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828662 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932617 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932726 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036236 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.113521 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 03:04:34.405194703 +0000 UTC Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.137934 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.138024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.138121 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138113 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.138358 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138343 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138434 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138689 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139858 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.341029 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346802 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.365777 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.390327 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395759 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.412707 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.432325 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.432445 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.434984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435079 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.537943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538070 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.641721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.641819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.641848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.642072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.642096 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745262 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.847961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848098 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951125 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054443 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.114258 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:13:16.314443481 +0000 UTC Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157200 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260964 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467787 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779758 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.882890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.882982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.883016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.883049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.883072 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.985931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.985985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.985995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.986013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.986025 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.114871 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 07:02:21.068437752 +0000 UTC Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138279 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138183 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138430 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138306 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138512 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138641 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138784 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192507 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192698 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192722 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296189 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399946 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399996 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.503011 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607286 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710797 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814503 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.918953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919097 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.022957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023129 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.115086 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:15:16.386862125 +0000 UTC Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.166031 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d27255
9ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.188145 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc 
kubenswrapper[4962]: I0220 09:56:39.210473 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.232453 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.257948 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.288102 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cf
fc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.325863 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334867 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.351231 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.375193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.397089 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.418262 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842
f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438475 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438660 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.442896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.463810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.482458 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.503170 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d0
2340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.526416 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.543650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544804 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.546043 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.570897 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.626526 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75f
bf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.647764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648646 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751666 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855758 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959151 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062976 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.115811 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 00:54:23.86024088 +0000 UTC Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138202 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138263 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.138387 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138426 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.138664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.139175 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.139489 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166158 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273307 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.378933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483316 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586828 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690513 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897901 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897921 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.104908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.104983 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.105003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.105032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.105052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.116366 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:30:47.69192122 +0000 UTC Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208467 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208621 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312444 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519526 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622389 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725315 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829509 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933793 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036866 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.116868 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 16:05:39.426782801 +0000 UTC Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138271 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.138427 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138761 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.138799 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.138877 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.139014 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140362 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243317 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.558906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559250 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662892 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766424 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980826 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085619 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.117872 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:22:36.739328911 +0000 UTC Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.140557 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:43 crc kubenswrapper[4962]: E0220 09:56:43.140756 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188909 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292952 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.499973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.603721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604904 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708277 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811362 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.916091 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020858 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.118303 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:42:01.494335449 +0000 UTC Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.138853 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.138928 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.139027 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.138867 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139322 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139437 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.228978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229906 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333875 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437824 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.541885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.541953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.541972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.542005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.542026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749789 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853859 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.957940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.957989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.958003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.958032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.958044 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.061874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.061958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.061980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.062015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.062040 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.119177 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 12:33:37.995070001 +0000 UTC Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.165644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269677 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373781 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477405 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.581015 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684732 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788521 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893233 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998467 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998528 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102663 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.119326 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:59:11.572428717 +0000 UTC Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.138133 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.138176 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.138218 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.138956 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.139298 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.139409 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.139674 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.140000 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309882 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413865 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.482826 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490574 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490642 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.513673 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520469 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.542638 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.570069 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577171 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.601294 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.601636 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604691 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708339 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811358 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914810 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.119528 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 12:33:48.305472184 +0000 UTC Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120520 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224398 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327214 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533410 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533463 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636485 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:47 crc kubenswrapper[4962]: E0220 09:56:47.751542 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:56:47 crc kubenswrapper[4962]: E0220 09:56:47.751667 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:51.751630788 +0000 UTC m=+163.334102674 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751663 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751801 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957565 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.060909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.060972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.060991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.061017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.061036 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.120417 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 01:21:49.637571447 +0000 UTC Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138827 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138876 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138983 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138833 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139018 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139095 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139279 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164352 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266725 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.369969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370072 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.472795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.472866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.472890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.473016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.473099 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577240 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680774 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783757 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.888011 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991170 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094167 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.121658 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 13:49:55.608593091 +0000 UTC Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.158367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"na
me\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.171017 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc
4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197511 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.207635 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.207619855 podStartE2EDuration="1m15.207619855s" podCreationTimestamp="2026-02-20 09:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.206065827 +0000 UTC m=+100.788537693" watchObservedRunningTime="2026-02-20 09:56:49.207619855 +0000 UTC m=+100.790091701" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.223617 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.223585945 podStartE2EDuration="26.223585945s" podCreationTimestamp="2026-02-20 09:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.223331437 +0000 UTC m=+100.805803283" watchObservedRunningTime="2026-02-20 09:56:49.223585945 +0000 UTC m=+100.806057791" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.282027 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.281998697 podStartE2EDuration="50.281998697s" podCreationTimestamp="2026-02-20 09:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.265873643 +0000 UTC m=+100.848345499" watchObservedRunningTime="2026-02-20 09:56:49.281998697 +0000 UTC m=+100.864470543" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 
crc kubenswrapper[4962]: I0220 09:56:49.301123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.369562 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wqwgj" podStartSLOduration=80.369544894 podStartE2EDuration="1m20.369544894s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.347031764 +0000 UTC m=+100.929503610" watchObservedRunningTime="2026-02-20 09:56:49.369544894 +0000 UTC m=+100.952016740" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.394669 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.394646625 podStartE2EDuration="1m17.394646625s" podCreationTimestamp="2026-02-20 09:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 
09:56:49.393457808 +0000 UTC m=+100.975929654" watchObservedRunningTime="2026-02-20 09:56:49.394646625 +0000 UTC m=+100.977118471" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403744 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.412361 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.412341437 podStartE2EDuration="1m21.412341437s" podCreationTimestamp="2026-02-20 09:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.411520733 +0000 UTC m=+100.993992579" watchObservedRunningTime="2026-02-20 09:56:49.412341437 +0000 UTC m=+100.994813283" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.442762 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" podStartSLOduration=79.442737821 podStartE2EDuration="1m19.442737821s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.441286836 +0000 UTC m=+101.023758702" watchObservedRunningTime="2026-02-20 09:56:49.442737821 +0000 UTC m=+101.025209687" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.470860 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s8xxr" podStartSLOduration=80.470835502 podStartE2EDuration="1m20.470835502s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.470483472 +0000 UTC m=+101.052955318" watchObservedRunningTime="2026-02-20 09:56:49.470835502 +0000 UTC m=+101.053307358" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.482220 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hxb97" podStartSLOduration=80.482201181 
podStartE2EDuration="1m20.482201181s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.481341416 +0000 UTC m=+101.063813262" watchObservedRunningTime="2026-02-20 09:56:49.482201181 +0000 UTC m=+101.064673027" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608767 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711143 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.813496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814497 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917813 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200038 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200032 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:55:41.369249611 +0000 UTC Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200157 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.200392 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.200475 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200703 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.200834 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.201550 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.201917 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202608 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306856 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410424 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617399 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720417 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720470 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.928878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.928970 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.928993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.929025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.929054 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031548 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134199 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.201254 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 02:11:05.588673063 +0000 UTC Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236490 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.339917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.339974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.339992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.340013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.340026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443439 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.649979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650073 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752947 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856359 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.958951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061875 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138796 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138844 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138917 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138804 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139023 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139137 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139282 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139395 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164526 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.202026 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 08:54:31.973931217 +0000 UTC Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268715 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372183 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.585315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.585931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.586413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.586459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.586488 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691676 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796339 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.899969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900114 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003774 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.203030 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:31:12.55139107 +0000 UTC Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210864 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314958 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419729 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522932 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.731923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732078 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042163 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138353 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138472 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138354 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138424 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.138782 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.138962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.139155 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.139320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.204059 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:17:40.592174691 +0000 UTC Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248688 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.350916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.350999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.351023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.351056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.351074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.453989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454090 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556444 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658655 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761955 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069713 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.139170 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:55 crc kubenswrapper[4962]: E0220 09:56:55.139400 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172581 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.204779 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 22:11:40.400795172 +0000 UTC Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274896 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377347 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479950 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582648 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.684968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993195 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096978 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138339 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138377 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138415 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138538 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.138716 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.138951 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.139055 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.138993 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200240 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.205413 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 00:07:53.081491484 +0000 UTC Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.303904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.303980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.303998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.304023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.304042 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406916 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509414 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.612001 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715215 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.819998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820080 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922720 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958651 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.013991 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz"] Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.014566 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018012 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018234 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018261 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018324 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.033816 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" podStartSLOduration=88.033667606 podStartE2EDuration="1m28.033667606s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:57.032518901 +0000 UTC m=+108.614990747" watchObservedRunningTime="2026-02-20 09:56:57.033667606 +0000 UTC m=+108.616139452" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.048663 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podStartSLOduration=88.048632646 podStartE2EDuration="1m28.048632646s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:57.046950814 +0000 UTC m=+108.629422700" watchObservedRunningTime="2026-02-20 
09:56:57.048632646 +0000 UTC m=+108.631104532" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.082922 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.082985 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5229176-0c7f-4323-87a3-b9a848df3af0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.083023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5229176-0c7f-4323-87a3-b9a848df3af0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.083123 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.083146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5229176-0c7f-4323-87a3-b9a848df3af0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184236 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184282 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5229176-0c7f-4323-87a3-b9a848df3af0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184301 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5229176-0c7f-4323-87a3-b9a848df3af0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5229176-0c7f-4323-87a3-b9a848df3af0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.190605 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5229176-0c7f-4323-87a3-b9a848df3af0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.191477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c5229176-0c7f-4323-87a3-b9a848df3af0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.205533 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 00:25:48.983256009 +0000 UTC Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.205730 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.206024 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5229176-0c7f-4323-87a3-b9a848df3af0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.215019 4962 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.337708 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.789513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" event={"ID":"c5229176-0c7f-4323-87a3-b9a848df3af0","Type":"ContainerStarted","Data":"415608f9120a0f246bb97bd6289d0fa63e5ee629011761737cd2277a75e86e19"} Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.789560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" event={"ID":"c5229176-0c7f-4323-87a3-b9a848df3af0","Type":"ContainerStarted","Data":"0601267d1dadf5d213a406f50a286f14eef4de193b7dc1765e78527715c629af"} Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.809995 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" podStartSLOduration=88.809970392 podStartE2EDuration="1m28.809970392s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:57.808203247 +0000 UTC m=+109.390675103" watchObservedRunningTime="2026-02-20 09:56:57.809970392 +0000 UTC m=+109.392442248" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138162 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138189 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138246 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138293 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138474 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138682 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138874 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138244 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138278 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138335 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138641 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.138884 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.139001 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.139103 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.139214 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138425 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138496 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.138635 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138673 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.138844 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.138962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.139117 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811309 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811854 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/0.log" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811907 4962 generic.go:334] "Generic (PLEG): container finished" podID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" exitCode=1 Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811960 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerDied","Data":"330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67"} Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.812009 4962 scope.go:117] "RemoveContainer" containerID="e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.812940 4962 scope.go:117] "RemoveContainer" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" Feb 20 09:57:03 crc kubenswrapper[4962]: E0220 09:57:03.813345 4962 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wqwgj_openshift-multus(1957ac70-30f9-48c2-a82b-72aa3b7a883a)\"" pod="openshift-multus/multus-wqwgj" podUID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138124 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138207 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138364 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138364 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138523 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138830 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138920 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.818733 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138211 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138667 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138297 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138763 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138326 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138247 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138890 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138973 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.138709 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.138784 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.138854 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139073 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.139151 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139311 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139496 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139655 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.140260 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.834920 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.837957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.838658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 
09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.885229 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podStartSLOduration=99.885204429 podStartE2EDuration="1m39.885204429s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:08.884661012 +0000 UTC m=+120.467132898" watchObservedRunningTime="2026-02-20 09:57:08.885204429 +0000 UTC m=+120.467676295" Feb 20 09:57:09 crc kubenswrapper[4962]: I0220 09:57:09.025696 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bwk2"] Feb 20 09:57:09 crc kubenswrapper[4962]: I0220 09:57:09.025837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:09 crc kubenswrapper[4962]: E0220 09:57:09.025960 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:09 crc kubenswrapper[4962]: E0220 09:57:09.106804 4962 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 20 09:57:09 crc kubenswrapper[4962]: E0220 09:57:09.230172 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 09:57:10 crc kubenswrapper[4962]: I0220 09:57:10.138285 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:10 crc kubenswrapper[4962]: I0220 09:57:10.138355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:10 crc kubenswrapper[4962]: I0220 09:57:10.138356 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:10 crc kubenswrapper[4962]: E0220 09:57:10.138455 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:10 crc kubenswrapper[4962]: E0220 09:57:10.138583 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:10 crc kubenswrapper[4962]: E0220 09:57:10.138734 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:11 crc kubenswrapper[4962]: I0220 09:57:11.137969 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:11 crc kubenswrapper[4962]: E0220 09:57:11.138177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:12 crc kubenswrapper[4962]: I0220 09:57:12.138125 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:12 crc kubenswrapper[4962]: I0220 09:57:12.138271 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:12 crc kubenswrapper[4962]: I0220 09:57:12.138412 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:12 crc kubenswrapper[4962]: E0220 09:57:12.138583 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:12 crc kubenswrapper[4962]: E0220 09:57:12.138741 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:12 crc kubenswrapper[4962]: E0220 09:57:12.138809 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:13 crc kubenswrapper[4962]: I0220 09:57:13.138183 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:13 crc kubenswrapper[4962]: E0220 09:57:13.138386 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:14 crc kubenswrapper[4962]: I0220 09:57:14.137890 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.138085 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:14 crc kubenswrapper[4962]: I0220 09:57:14.138203 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.138294 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:14 crc kubenswrapper[4962]: I0220 09:57:14.138365 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.138441 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.232523 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 09:57:15 crc kubenswrapper[4962]: I0220 09:57:15.138321 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:15 crc kubenswrapper[4962]: E0220 09:57:15.138536 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.138987 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.139196 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.139203 4962 scope.go:117] "RemoveContainer" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.139042 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:16 crc kubenswrapper[4962]: E0220 09:57:16.139355 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:16 crc kubenswrapper[4962]: E0220 09:57:16.141124 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:16 crc kubenswrapper[4962]: E0220 09:57:16.141399 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.870956 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.871437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef"} Feb 20 09:57:17 crc kubenswrapper[4962]: I0220 09:57:17.138153 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:17 crc kubenswrapper[4962]: E0220 09:57:17.138329 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:17 crc kubenswrapper[4962]: I0220 09:57:17.432589 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:57:18 crc kubenswrapper[4962]: I0220 09:57:18.138414 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:18 crc kubenswrapper[4962]: E0220 09:57:18.138579 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:18 crc kubenswrapper[4962]: I0220 09:57:18.138576 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:18 crc kubenswrapper[4962]: I0220 09:57:18.138694 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:18 crc kubenswrapper[4962]: E0220 09:57:18.138941 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:18 crc kubenswrapper[4962]: E0220 09:57:18.139029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:19 crc kubenswrapper[4962]: I0220 09:57:19.138100 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:19 crc kubenswrapper[4962]: E0220 09:57:19.139121 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.138740 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.138794 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.138979 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.142482 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.143881 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.144040 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.144201 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 09:57:21 crc kubenswrapper[4962]: I0220 09:57:21.138665 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:21 crc kubenswrapper[4962]: I0220 09:57:21.142783 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 09:57:21 crc kubenswrapper[4962]: I0220 09:57:21.142805 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.462701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.525413 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.526397 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.528984 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jtftl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.532913 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.533770 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.536939 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.537133 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547325 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547406 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547479 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547942 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547957 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548672 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548791 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit-dir\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548807 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548859 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-serving-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548919 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-image-import-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548958 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-serving-cert\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-node-pullsecrets\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549102 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit\") pod \"apiserver-76f77b778f-jtftl\" (UID: 
\"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549132 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-client\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549204 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-encryption-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79w4l\" (UniqueName: \"kubernetes.io/projected/37e7b911-da73-4f82-ad0c-d8707547b7a7-kube-api-access-79w4l\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549499 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.550382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.551245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.560326 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.561126 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.561256 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.561718 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k85np"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562044 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tp9zq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562583 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562230 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.565672 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.566539 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.566968 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.567727 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.567756 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.567874 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.568943 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.569025 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.569073 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.577086 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.578269 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.579131 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.579654 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.580423 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.581494 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.581823 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.583667 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tv8j9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.584380 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.584665 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lf26"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.585225 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.587653 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.588101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.588536 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.589204 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.590776 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.591360 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.593853 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.593906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.594369 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.594581 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.594857 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.595058 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.595375 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.595516 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.598518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599304 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 
09:57:27.599450 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599960 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600013 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600113 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600155 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600435 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600641 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600803 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600821 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600954 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601035 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601096 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601159 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601336 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601420 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601531 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.612077 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.612689 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599961 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.618439 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.620711 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.621363 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.621569 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.621882 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.622187 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.622461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.623320 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.625929 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635099 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635311 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635615 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635739 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635865 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636368 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636509 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636636 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636895 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.637008 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.637183 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wndb7"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.638440 
4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.638745 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.638952 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.639134 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.639440 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.639665 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.640025 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.640259 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.640476 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.644066 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.645493 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 
09:57:27.645840 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646415 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646513 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646563 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646690 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646966 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647036 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647119 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647228 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647293 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647318 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647431 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647464 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647577 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647708 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647987 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.649481 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.649687 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.649904 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.650383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651368 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b55a13cf-03c6-46d9-b286-960a839b1558-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-node-pullsecrets\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-serving-cert\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfswd\" (UniqueName: \"kubernetes.io/projected/474b1e5d-9a6f-4931-be66-8fb20c82ac60-kube-api-access-nfswd\") pod \"migrator-59844c95c7-dc74p\" (UID: \"474b1e5d-9a6f-4931-be66-8fb20c82ac60\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651544 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.652899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.662965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-node-pullsecrets\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.663713 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.664016 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.665436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.665901 4962 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.666170 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.666273 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.669908 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98r7\" (UniqueName: \"kubernetes.io/projected/34f42578-fcc9-4539-add3-bca8deb6927b-kube-api-access-s98r7\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.669963 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34f42578-fcc9-4539-add3-bca8deb6927b-metrics-tls\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.674875 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.677211 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79w4l\" (UniqueName: \"kubernetes.io/projected/37e7b911-da73-4f82-ad0c-d8707547b7a7-kube-api-access-79w4l\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-config\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677645 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5sb4\" (UniqueName: 
\"kubernetes.io/projected/77ff4d6a-8c1e-440f-a78c-900c09587848-kube-api-access-s5sb4\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrg46\" (UniqueName: \"kubernetes.io/projected/75c3ba8d-4548-4407-9188-a785ef05da2c-kube-api-access-lrg46\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-encryption-config\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677978 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit-dir\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 
20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678115 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678225 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 
20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-trusted-ca\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dpx6\" (UniqueName: \"kubernetes.io/projected/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-kube-api-access-8dpx6\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678783 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678880 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 
09:57:27.678902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljp6k\" (UniqueName: \"kubernetes.io/projected/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-kube-api-access-ljp6k\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.679068 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-image-import-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.679773 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.679955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-proxy-tls\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680026 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225d1d1d-8168-4489-af91-6a87f28c39ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-serving-cert\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-client\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680098 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d962f6fe-d955-483d-b149-976a11dd4922-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680129 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdt5\" (UniqueName: \"kubernetes.io/projected/cba11394-4e55-4edc-beec-750bddabc1d0-kube-api-access-2cdt5\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: 
\"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680149 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/225d1d1d-8168-4489-af91-6a87f28c39ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680203 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-images\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680228 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.680245 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-client\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680333 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-encryption-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: 
\"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77ff4d6a-8c1e-440f-a78c-900c09587848-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680464 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cba11394-4e55-4edc-beec-750bddabc1d0-audit-dir\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225d1d1d-8168-4489-af91-6a87f28c39ed-config\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680498 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-audit-policies\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680567 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljckj\" (UniqueName: \"kubernetes.io/projected/b55a13cf-03c6-46d9-b286-960a839b1558-kube-api-access-ljckj\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d962f6fe-d955-483d-b149-976a11dd4922-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.681161 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.681370 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98nf\" (UniqueName: \"kubernetes.io/projected/7ef2d9f9-34f2-48a6-83eb-689c0fdcac66-kube-api-access-x98nf\") pod \"downloads-7954f5f757-tv8j9\" (UID: \"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66\") " pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682808 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-serving-cert\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.686637 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-image-import-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677214 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tcwqj"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.687388 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.687781 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.687805 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ckmh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 
09:57:27.685565 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit-dir\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.689213 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.689678 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.684056 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.685018 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.690728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.691795 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.692637 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.693168 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.693959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75c3ba8d-4548-4407-9188-a785ef05da2c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.693996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-serving-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.694479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smr5m\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-kube-api-access-smr5m\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.694644 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.694725 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.695739 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-serving-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.701696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-client\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.701733 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.702264 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.702683 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.704454 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.705145 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707022 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707050 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707086 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707757 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707847 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.713473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.715259 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.715833 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.716679 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-serving-cert\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.722191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-encryption-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.726518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.727184 4962 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.727749 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.728360 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.730877 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.732203 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.736361 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.742100 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.744673 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.750332 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.752045 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tv8j9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.752911 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jtftl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.753610 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.759051 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.760028 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.769424 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.772013 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7q8sx"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.773027 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.774367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.775152 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.776465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.777610 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.778522 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.780099 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lbvml"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.780714 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.782828 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.783992 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tp9zq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.785031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.788245 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4mw9f"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.790620 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 
09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.792543 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.792616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.792756 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.793292 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7nh4t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.794558 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-l92fq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.795153 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.795217 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75c3ba8d-4548-4407-9188-a785ef05da2c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-serving-cert\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-smr5m\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-kube-api-access-smr5m\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796925 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfswd\" (UniqueName: \"kubernetes.io/projected/474b1e5d-9a6f-4931-be66-8fb20c82ac60-kube-api-access-nfswd\") pod \"migrator-59844c95c7-dc74p\" (UID: \"474b1e5d-9a6f-4931-be66-8fb20c82ac60\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b55a13cf-03c6-46d9-b286-960a839b1558-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-serving-cert\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797079 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98r7\" (UniqueName: \"kubernetes.io/projected/34f42578-fcc9-4539-add3-bca8deb6927b-kube-api-access-s98r7\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34f42578-fcc9-4539-add3-bca8deb6927b-metrics-tls\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797200 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-config\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5sb4\" (UniqueName: \"kubernetes.io/projected/77ff4d6a-8c1e-440f-a78c-900c09587848-kube-api-access-s5sb4\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrg46\" (UniqueName: \"kubernetes.io/projected/75c3ba8d-4548-4407-9188-a785ef05da2c-kube-api-access-lrg46\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") 
" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-encryption-config\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dpx6\" (UniqueName: \"kubernetes.io/projected/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-kube-api-access-8dpx6\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797431 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod 
\"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797484 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-trusted-ca\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797559 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljp6k\" (UniqueName: \"kubernetes.io/projected/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-kube-api-access-ljp6k\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-proxy-tls\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d962f6fe-d955-483d-b149-976a11dd4922-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225d1d1d-8168-4489-af91-6a87f28c39ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-client\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797750 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdt5\" (UniqueName: \"kubernetes.io/projected/cba11394-4e55-4edc-beec-750bddabc1d0-kube-api-access-2cdt5\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/225d1d1d-8168-4489-af91-6a87f28c39ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797842 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-images\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797935 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797969 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77ff4d6a-8c1e-440f-a78c-900c09587848-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225d1d1d-8168-4489-af91-6a87f28c39ed-config\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cba11394-4e55-4edc-beec-750bddabc1d0-audit-dir\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798078 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod 
\"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798359 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-audit-policies\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d962f6fe-d955-483d-b149-976a11dd4922-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljckj\" (UniqueName: \"kubernetes.io/projected/b55a13cf-03c6-46d9-b286-960a839b1558-kube-api-access-ljckj\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798492 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798530 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98nf\" (UniqueName: \"kubernetes.io/projected/7ef2d9f9-34f2-48a6-83eb-689c0fdcac66-kube-api-access-x98nf\") pod \"downloads-7954f5f757-tv8j9\" (UID: \"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66\") " pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799331 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-audit-policies\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799791 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ckmh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: 
\"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801166 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800353 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801448 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-config\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801540 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d962f6fe-d955-483d-b149-976a11dd4922-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801805 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k85np"] Feb 20 09:57:27 crc kubenswrapper[4962]: 
I0220 09:57:27.802766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cba11394-4e55-4edc-beec-750bddabc1d0-audit-dir\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803786 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-encryption-config\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.804138 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-serving-cert\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-serving-cert\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804114 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-images\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/225d1d1d-8168-4489-af91-6a87f28c39ed-config\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805534 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-trusted-ca\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805983 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b55a13cf-03c6-46d9-b286-960a839b1558-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lf26"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806067 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806218 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34f42578-fcc9-4539-add3-bca8deb6927b-metrics-tls\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-client\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.807030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.807038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.807171 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.808356 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.808487 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.808741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.809496 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225d1d1d-8168-4489-af91-6a87f28c39ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.810264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.811048 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.813433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.813640 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-proxy-tls\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.814307 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.814412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d962f6fe-d955-483d-b149-976a11dd4922-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.815947 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.817099 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.818218 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.818910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.819253 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.820529 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.821684 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.822737 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 
09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.823814 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7nh4t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.825063 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lbvml"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.875283 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.875548 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.875367 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.878290 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.879498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.883291 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wndb7"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.884950 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.886088 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7q8sx"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.888065 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.889245 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.890839 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.893698 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.897100 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4mw9f"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.898745 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7tj4j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.899653 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.900842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7tj4j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.909132 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.928341 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.948351 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.969319 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.989379 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.001869 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75c3ba8d-4548-4407-9188-a785ef05da2c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.008818 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.028702 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.048354 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.069365 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.089212 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.098002 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77ff4d6a-8c1e-440f-a78c-900c09587848-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.109544 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.129562 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.168165 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.188733 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.208737 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.262883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79w4l\" (UniqueName: \"kubernetes.io/projected/37e7b911-da73-4f82-ad0c-d8707547b7a7-kube-api-access-79w4l\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.269861 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.288260 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.309305 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.329309 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.348997 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.369105 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.389329 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.409358 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.429436 4962 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.448811 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.469189 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.489293 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.496666 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.510387 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.529452 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.548963 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.569138 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.589936 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.614985 4962 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.629220 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.649894 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.669050 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.689131 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.710524 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.730077 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.742878 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jtftl"] Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.747580 4962 request.go:700] Waited for 1.015016016s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.749549 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.768815 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.789794 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.810165 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.830379 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.850160 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.869523 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.888632 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.909608 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.914358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerStarted","Data":"6f9b750267fbada324f415b4eab8ccc588bfee1c71de79cbab7087db44b8d785"} Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.939694 4962 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.948312 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.969509 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.989819 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.009509 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.028652 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.049065 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.070465 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.090417 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.109778 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.128565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 
09:57:29.149715 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.169194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.190005 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.209332 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.228667 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.250890 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.270257 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.289784 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.309919 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.329388 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.349003 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 09:57:29 crc 
kubenswrapper[4962]: I0220 09:57:29.369487 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.389101 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.411100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.428724 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.449842 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.469483 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.522319 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr5m\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-kube-api-access-smr5m\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.526082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 
09:57:29.529836 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.564495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfswd\" (UniqueName: \"kubernetes.io/projected/474b1e5d-9a6f-4931-be66-8fb20c82ac60-kube-api-access-nfswd\") pod \"migrator-59844c95c7-dc74p\" (UID: \"474b1e5d-9a6f-4931-be66-8fb20c82ac60\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.582086 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.588947 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98nf\" (UniqueName: \"kubernetes.io/projected/7ef2d9f9-34f2-48a6-83eb-689c0fdcac66-kube-api-access-x98nf\") pod \"downloads-7954f5f757-tv8j9\" (UID: \"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66\") " pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.599189 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.619648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/225d1d1d-8168-4489-af91-6a87f28c39ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.623700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98r7\" (UniqueName: \"kubernetes.io/projected/34f42578-fcc9-4539-add3-bca8deb6927b-kube-api-access-s98r7\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.655540 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.681062 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrg46\" (UniqueName: \"kubernetes.io/projected/75c3ba8d-4548-4407-9188-a785ef05da2c-kube-api-access-lrg46\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.683229 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.688864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljp6k\" (UniqueName: \"kubernetes.io/projected/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-kube-api-access-ljp6k\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.690528 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.699620 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.708845 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.710926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.714993 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.734568 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.735801 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.745792 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.750410 4962 request.go:700] Waited for 1.948759718s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/serviceaccounts/multus-ac/token Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.763419 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dpx6\" (UniqueName: \"kubernetes.io/projected/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-kube-api-access-8dpx6\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.769418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5sb4\" (UniqueName: \"kubernetes.io/projected/77ff4d6a-8c1e-440f-a78c-900c09587848-kube-api-access-s5sb4\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.800400 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ljckj\" (UniqueName: \"kubernetes.io/projected/b55a13cf-03c6-46d9-b286-960a839b1558-kube-api-access-ljckj\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.806315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdt5\" (UniqueName: \"kubernetes.io/projected/cba11394-4e55-4edc-beec-750bddabc1d0-kube-api-access-2cdt5\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.809475 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.828461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.845463 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.858922 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.859007 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.866832 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.885689 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.889521 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.910812 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc"] Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.928385 4962 generic.go:334] "Generic (PLEG): container finished" podID="37e7b911-da73-4f82-ad0c-d8707547b7a7" containerID="9350a8e12c71e3a008abd7f495bb1ba136c90c080869abe3047b6b06cdcbfe9a" exitCode=0 Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.928561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerDied","Data":"9350a8e12c71e3a008abd7f495bb1ba136c90c080869abe3047b6b06cdcbfe9a"} Feb 20 09:57:29 crc kubenswrapper[4962]: W0220 09:57:29.950032 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd962f6fe_d955_483d_b149_976a11dd4922.slice/crio-9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae WatchSource:0}: Error finding container 9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae: Status 404 returned error can't find the container with id 9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.961995 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7d606a-36a8-4608-918c-ed88eaf93a6d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034647 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1d0fd4e8-ba15-4d2f-9602-e887819ea423-machine-approver-tls\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq2c\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-kube-api-access-pmq2c\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034838 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phch9\" (UniqueName: \"kubernetes.io/projected/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-kube-api-access-phch9\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: 
\"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.035314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.035713 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.035869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036829 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2cv\" (UniqueName: \"kubernetes.io/projected/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-kube-api-access-8g2cv\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxqx\" (UniqueName: \"kubernetes.io/projected/0e4e18be-a43b-492a-981e-b4f9aebff1ab-kube-api-access-7qxqx\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7d606a-36a8-4608-918c-ed88eaf93a6d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-serving-cert\") pod 
\"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037010 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037042 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-auth-proxy-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: 
I0220 09:57:30.037113 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037641 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2t2\" (UniqueName: \"kubernetes.io/projected/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-kube-api-access-sh2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037998 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038031 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3febff6-f15f-4ce8-825c-37d86b13c56d-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4e18be-a43b-492a-981e-b4f9aebff1ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038136 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nlc\" (UniqueName: \"kubernetes.io/projected/1d0fd4e8-ba15-4d2f-9602-e887819ea423-kube-api-access-c6nlc\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038216 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-config\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e4e18be-a43b-492a-981e-b4f9aebff1ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgr4s\" (UniqueName: \"kubernetes.io/projected/ac7d606a-36a8-4608-918c-ed88eaf93a6d-kube-api-access-lgr4s\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038300 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-service-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038332 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038412 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3febff6-f15f-4ce8-825c-37d86b13c56d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.044816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.051643 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.551551288 +0000 UTC m=+142.134023324 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.147454 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.147719 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.647689484 +0000 UTC m=+142.230161320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-node-bootstrap-token\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148211 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2cv\" (UniqueName: \"kubernetes.io/projected/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-kube-api-access-8g2cv\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148263 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qxqx\" (UniqueName: \"kubernetes.io/projected/0e4e18be-a43b-492a-981e-b4f9aebff1ab-kube-api-access-7qxqx\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7d606a-36a8-4608-918c-ed88eaf93a6d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674f40ed-74ed-48c2-8036-087ce9e16c94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148389 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplfv\" (UniqueName: \"kubernetes.io/projected/319cf696-9a12-40dc-9f4a-d80fab9a97f8-kube-api-access-hplfv\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qbw\" (UniqueName: \"kubernetes.io/projected/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-kube-api-access-s9qbw\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148509 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-stats-auth\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148542 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrw8j\" (UniqueName: 
\"kubernetes.io/projected/8cb06d17-6188-4cca-84b7-f3d03abb20e8-kube-api-access-hrw8j\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148614 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-client\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3febff6-f15f-4ce8-825c-37d86b13c56d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674f40ed-74ed-48c2-8036-087ce9e16c94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: 
I0220 09:57:30.148712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqfmw\" (UniqueName: \"kubernetes.io/projected/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-kube-api-access-zqfmw\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4e18be-a43b-492a-981e-b4f9aebff1ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-mountpoint-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-config-volume\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-srv-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6nlc\" (UniqueName: \"kubernetes.io/projected/1d0fd4e8-ba15-4d2f-9602-e887819ea423-kube-api-access-c6nlc\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149011 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149028 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvc5z\" (UniqueName: \"kubernetes.io/projected/7e5e4942-63be-4811-8aaa-d6b53a427541-kube-api-access-gvc5z\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149847 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-certs\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149890 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-socket-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0240e440-4be2-4607-99c4-636b65e78081-signing-cabundle\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.150486 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.150727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151120 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-config\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151143 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-config\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5e4942-63be-4811-8aaa-d6b53a427541-service-ca-bundle\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " 
pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151225 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-registration-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151385 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3721fc4d-6f04-458e-a74c-0fe816908414-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.151564 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.651555796 +0000 UTC m=+142.234027642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151638 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3febff6-f15f-4ce8-825c-37d86b13c56d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-profile-collector-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151705 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0240e440-4be2-4607-99c4-636b65e78081-signing-key\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod 
\"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.152833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3febff6-f15f-4ce8-825c-37d86b13c56d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154098 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154386 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: 
I0220 09:57:30.154441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-plugins-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154480 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-srv-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4e18be-a43b-492a-981e-b4f9aebff1ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdpz\" (UniqueName: \"kubernetes.io/projected/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-kube-api-access-ckdpz\") pod \"dns-default-4mw9f\" (UID: 
\"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pg8k\" (UniqueName: \"kubernetes.io/projected/32025b2b-9232-449f-b7bc-582d81d76430-kube-api-access-2pg8k\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154864 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zbm\" (UniqueName: \"kubernetes.io/projected/3721fc4d-6f04-458e-a74c-0fe816908414-kube-api-access-v7zbm\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155293 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2knp\" (UniqueName: \"kubernetes.io/projected/daef1622-b612-4661-bb6a-63c5997d9a07-kube-api-access-j2knp\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155881 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7d606a-36a8-4608-918c-ed88eaf93a6d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.156067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 
20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.156132 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.156155 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-webhook-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.157163 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.157923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-serving-cert\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.157977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-serving-cert\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158577 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-auth-proxy-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158610 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2t2\" (UniqueName: \"kubernetes.io/projected/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-kube-api-access-sh2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319cf696-9a12-40dc-9f4a-d80fab9a97f8-cert\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.159222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159317 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvk5g\" (UniqueName: \"kubernetes.io/projected/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-kube-api-access-qvk5g\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-csi-data-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159384 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-auth-proxy-config\") pod 
\"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-serving-cert\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159634 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxtf\" (UniqueName: \"kubernetes.io/projected/0240e440-4be2-4607-99c4-636b65e78081-kube-api-access-5xxtf\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159666 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-config\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159696 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e4e18be-a43b-492a-981e-b4f9aebff1ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtth\" (UniqueName: \"kubernetes.io/projected/9c993e86-3068-4d07-84b3-655f8308b7ed-kube-api-access-kjtth\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159736 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-metrics-certs\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159771 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674f40ed-74ed-48c2-8036-087ce9e16c94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159791 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-service-ca\") pod 
\"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159815 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgr4s\" (UniqueName: \"kubernetes.io/projected/ac7d606a-36a8-4608-918c-ed88eaf93a6d-kube-api-access-lgr4s\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160110 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e4e18be-a43b-492a-981e-b4f9aebff1ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160291 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-config\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160394 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160416 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-service-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-images\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-metrics-tls\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160675 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160796 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-apiservice-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-service-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jgvn\" (UniqueName: 
\"kubernetes.io/projected/a7a9fa76-da75-4847-a539-d1e6bb57da98-kube-api-access-9jgvn\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161615 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-kube-api-access-l9ffd\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161647 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161663 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 
20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7d606a-36a8-4608-918c-ed88eaf93a6d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c993e86-3068-4d07-84b3-655f8308b7ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161764 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1d0fd4e8-ba15-4d2f-9602-e887819ea423-machine-approver-tls\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-tmpfs\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161823 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmq2c\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-kube-api-access-pmq2c\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-default-certificate\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161893 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-config\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.161911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161929 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phch9\" (UniqueName: \"kubernetes.io/projected/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-kube-api-access-phch9\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161945 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-config\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7a9fa76-da75-4847-a539-d1e6bb57da98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3721fc4d-6f04-458e-a74c-0fe816908414-proxy-tls\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.162013 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.162962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.165581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.165615 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.171507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.176926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-serving-cert\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.176965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7d606a-36a8-4608-918c-ed88eaf93a6d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.178077 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3febff6-f15f-4ce8-825c-37d86b13c56d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.181943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1d0fd4e8-ba15-4d2f-9602-e887819ea423-machine-approver-tls\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.182145 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tv8j9"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.191868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2cv\" (UniqueName: \"kubernetes.io/projected/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-kube-api-access-8g2cv\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.191974 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225d1d1d_8168_4489_af91_6a87f28c39ed.slice/crio-979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa WatchSource:0}: Error finding container 979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa: Status 404 returned error can't find the container with id 979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.209387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nlc\" (UniqueName: \"kubernetes.io/projected/1d0fd4e8-ba15-4d2f-9602-e887819ea423-kube-api-access-c6nlc\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.241188 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.262840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262930 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-profile-collector-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262956 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0240e440-4be2-4607-99c4-636b65e78081-signing-key\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262984 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-plugins-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-srv-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 
09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpz\" (UniqueName: \"kubernetes.io/projected/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-kube-api-access-ckdpz\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pg8k\" (UniqueName: \"kubernetes.io/projected/32025b2b-9232-449f-b7bc-582d81d76430-kube-api-access-2pg8k\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263065 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zbm\" (UniqueName: \"kubernetes.io/projected/3721fc4d-6f04-458e-a74c-0fe816908414-kube-api-access-v7zbm\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263090 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2knp\" (UniqueName: \"kubernetes.io/projected/daef1622-b612-4661-bb6a-63c5997d9a07-kube-api-access-j2knp\") pod \"olm-operator-6b444d44fb-rsf8j\" 
(UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-webhook-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-serving-cert\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319cf696-9a12-40dc-9f4a-d80fab9a97f8-cert\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvk5g\" (UniqueName: \"kubernetes.io/projected/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-kube-api-access-qvk5g\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-csi-data-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-serving-cert\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xxtf\" (UniqueName: \"kubernetes.io/projected/0240e440-4be2-4607-99c4-636b65e78081-kube-api-access-5xxtf\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263292 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kjtth\" (UniqueName: \"kubernetes.io/projected/9c993e86-3068-4d07-84b3-655f8308b7ed-kube-api-access-kjtth\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-metrics-certs\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263336 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674f40ed-74ed-48c2-8036-087ce9e16c94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-service-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263374 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-images\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-metrics-tls\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-apiservice-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263428 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jgvn\" (UniqueName: \"kubernetes.io/projected/a7a9fa76-da75-4847-a539-d1e6bb57da98-kube-api-access-9jgvn\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod 
\"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-kube-api-access-l9ffd\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c993e86-3068-4d07-84b3-655f8308b7ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263549 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-tmpfs\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263580 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-default-certificate\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-config\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263644 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-config\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.265363 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.765331801 +0000 UTC m=+142.347803637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.265771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-config\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.266748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-serving-cert\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.267441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-images\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-service-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-config\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7a9fa76-da75-4847-a539-d1e6bb57da98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3721fc4d-6f04-458e-a74c-0fe816908414-proxy-tls\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-node-bootstrap-token\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-metrics-certs\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674f40ed-74ed-48c2-8036-087ce9e16c94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplfv\" (UniqueName: \"kubernetes.io/projected/319cf696-9a12-40dc-9f4a-d80fab9a97f8-kube-api-access-hplfv\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qbw\" (UniqueName: \"kubernetes.io/projected/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-kube-api-access-s9qbw\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-stats-auth\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrw8j\" (UniqueName: \"kubernetes.io/projected/8cb06d17-6188-4cca-84b7-f3d03abb20e8-kube-api-access-hrw8j\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271470 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-client\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271500 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/674f40ed-74ed-48c2-8036-087ce9e16c94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqfmw\" (UniqueName: \"kubernetes.io/projected/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-kube-api-access-zqfmw\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-mountpoint-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271621 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-config-volume\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-srv-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271680 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvc5z\" (UniqueName: \"kubernetes.io/projected/7e5e4942-63be-4811-8aaa-d6b53a427541-kube-api-access-gvc5z\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0240e440-4be2-4607-99c4-636b65e78081-signing-cabundle\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-certs\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271756 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-csi-data-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271782 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-socket-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-config\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-config\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5e4942-63be-4811-8aaa-d6b53a427541-service-ca-bundle\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271874 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-registration-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271903 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3721fc4d-6f04-458e-a74c-0fe816908414-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.272472 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-metrics-tls\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.273176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3721fc4d-6f04-458e-a74c-0fe816908414-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271098 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-plugins-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-tmpfs\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.273829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-mountpoint-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.274432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674f40ed-74ed-48c2-8036-087ce9e16c94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.276804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.277260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.277919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-config\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.278101 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-registration-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.278225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-srv-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.278531 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.778516489 +0000 UTC m=+142.360988335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.279068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5e4942-63be-4811-8aaa-d6b53a427541-service-ca-bundle\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.280242 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-config-volume\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.281328 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319cf696-9a12-40dc-9f4a-d80fab9a97f8-cert\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.281663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-config\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 
09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-serving-cert\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-webhook-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282786 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282944 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c993e86-3068-4d07-84b3-655f8308b7ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: 
\"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-client\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283564 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0240e440-4be2-4607-99c4-636b65e78081-signing-cabundle\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-socket-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.284423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-certs\") pod \"machine-config-server-l92fq\" (UID: 
\"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.284711 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-node-bootstrap-token\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.286829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.287636 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-stats-auth\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.287759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3721fc4d-6f04-458e-a74c-0fe816908414-proxy-tls\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.287796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.291218 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-srv-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.291217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-profile-collector-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.292206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qxqx\" (UniqueName: \"kubernetes.io/projected/0e4e18be-a43b-492a-981e-b4f9aebff1ab-kube-api-access-7qxqx\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.292469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-default-certificate\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.293734 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.294335 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0240e440-4be2-4607-99c4-636b65e78081-signing-key\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.295013 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7a9fa76-da75-4847-a539-d1e6bb57da98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.297181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674f40ed-74ed-48c2-8036-087ce9e16c94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.297324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-apiservice-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 
crc kubenswrapper[4962]: I0220 09:57:30.304926 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.310096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2t2\" (UniqueName: \"kubernetes.io/projected/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-kube-api-access-sh2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.318858 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c1fde1_72ce_4ce0_ade8_9c8e7016464c.slice/crio-e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67 WatchSource:0}: Error finding container e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67: Status 404 returned error can't find the container with id e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.321020 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.327638 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgr4s\" (UniqueName: \"kubernetes.io/projected/ac7d606a-36a8-4608-918c-ed88eaf93a6d-kube-api-access-lgr4s\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.337806 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-5lf26"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.345185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq2c\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-kube-api-access-pmq2c\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.346763 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f42578_fcc9_4539_add3_bca8deb6927b.slice/crio-266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499 WatchSource:0}: Error finding container 266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499: Status 404 returned error can't find the container with id 266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.351821 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.366011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.376488 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.377033 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.390997 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.890955251 +0000 UTC m=+142.473427097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.391187 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.393829 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.398494 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.401276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.408252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.411414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phch9\" (UniqueName: \"kubernetes.io/projected/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-kube-api-access-phch9\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.428771 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.431362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xxtf\" (UniqueName: \"kubernetes.io/projected/0240e440-4be2-4607-99c4-636b65e78081-kube-api-access-5xxtf\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.434099 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474b1e5d_9a6f_4931_be66_8fb20c82ac60.slice/crio-9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1 WatchSource:0}: Error finding container 9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1: Status 404 returned error can't find the container with id 9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.451885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wndb7"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.453697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtth\" (UniqueName: \"kubernetes.io/projected/9c993e86-3068-4d07-84b3-655f8308b7ed-kube-api-access-kjtth\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.466783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jgvn\" (UniqueName: \"kubernetes.io/projected/a7a9fa76-da75-4847-a539-d1e6bb57da98-kube-api-access-9jgvn\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: 
\"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.474837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.480004 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.480364 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.980332672 +0000 UTC m=+142.562804708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.481306 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ff4d6a_8c1e_440f_a78c_900c09587848.slice/crio-efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1 WatchSource:0}: Error finding container efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1: Status 404 returned error can't find the container with id efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.482724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674f40ed-74ed-48c2-8036-087ce9e16c94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.494407 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.505293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.527365 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2knp\" (UniqueName: \"kubernetes.io/projected/daef1622-b612-4661-bb6a-63c5997d9a07-kube-api-access-j2knp\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.557181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdpz\" (UniqueName: \"kubernetes.io/projected/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-kube-api-access-ckdpz\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.563138 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0fd4e8_ba15_4d2f_9602_e887819ea423.slice/crio-7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2 WatchSource:0}: Error finding container 7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2: Status 404 returned error can't find the container with id 7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.563162 4962 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.572631 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k85np"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.576379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pg8k\" (UniqueName: \"kubernetes.io/projected/32025b2b-9232-449f-b7bc-582d81d76430-kube-api-access-2pg8k\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.580815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.581278 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.08126269 +0000 UTC m=+142.663734536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.581378 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.597330 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.598744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.600346 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zbm\" (UniqueName: \"kubernetes.io/projected/3721fc4d-6f04-458e-a74c-0fe816908414-kube-api-access-v7zbm\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.609874 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.621092 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvk5g\" (UniqueName: \"kubernetes.io/projected/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-kube-api-access-qvk5g\") pod 
\"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.624453 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.625284 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.647054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-kube-api-access-l9ffd\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.664176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.665084 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.672417 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.679728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.682522 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.682905 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.18288746 +0000 UTC m=+142.765359306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.686440 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.688505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplfv\" (UniqueName: \"kubernetes.io/projected/319cf696-9a12-40dc-9f4a-d80fab9a97f8-kube-api-access-hplfv\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.702725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.708320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qbw\" (UniqueName: \"kubernetes.io/projected/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-kube-api-access-s9qbw\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.708758 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.714779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.721791 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.726702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrw8j\" (UniqueName: \"kubernetes.io/projected/8cb06d17-6188-4cca-84b7-f3d03abb20e8-kube-api-access-hrw8j\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.730434 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.733664 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z"] Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.734774 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8161f87_3814_4d02_84ff_b94b8b05c59e.slice/crio-5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877 WatchSource:0}: Error finding container 5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877: Status 404 returned error can't find the container with id 5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.736539 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.749401 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.755995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqfmw\" (UniqueName: \"kubernetes.io/projected/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-kube-api-access-zqfmw\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.762741 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.771545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvc5z\" (UniqueName: \"kubernetes.io/projected/7e5e4942-63be-4811-8aaa-d6b53a427541-kube-api-access-gvc5z\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.783277 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.783474 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.783699 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:31.283683703 +0000 UTC m=+142.866155549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.793570 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.800963 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.885609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.886126 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.386107878 +0000 UTC m=+142.968579724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.940261 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerStarted","Data":"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.940346 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerStarted","Data":"3bea97da1320becf13fecaed38868cc74c4f54c7308979ccb795e3bbe8eacf06"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.940946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.947620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tv8j9" event={"ID":"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66","Type":"ContainerStarted","Data":"d7df83fe690c7504f718b1b7a49e7fb1d729ec2cbbbbbf7011a63fe6ad057d6b"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.947691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tv8j9" event={"ID":"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66","Type":"ContainerStarted","Data":"391beeaa328a3138fd621ffd774f8909b7ab17f8cb49f6d3a5048995abcee98c"} Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.948867 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.953105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" event={"ID":"75c3ba8d-4548-4407-9188-a785ef05da2c","Type":"ContainerStarted","Data":"7511c197cf19c9bdfdd3db34b5c3ac3859b3a69eee3be1ed5e0a4ecc5cbfc156"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958319 4962 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-szbwm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958342 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958380 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958380 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.959030 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.959614 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" event={"ID":"474b1e5d-9a6f-4931-be66-8fb20c82ac60","Type":"ContainerStarted","Data":"9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.961608 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tp9zq"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.966753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" event={"ID":"77ff4d6a-8c1e-440f-a78c-900c09587848","Type":"ContainerStarted","Data":"efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.968149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" event={"ID":"ac7d606a-36a8-4608-918c-ed88eaf93a6d","Type":"ContainerStarted","Data":"661bc4f85372d8f949cd023db00242c6c96a2e76d1b45cee6f14cb90bb9f7255"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.976550 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerStarted","Data":"c9ca7261143890db86b7247b8197f46263302fc4c677314a7e1a1eadf9f9acf2"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.989225 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.991046 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.491027642 +0000 UTC m=+143.073499488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.996327 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.007790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" event={"ID":"34f42578-fcc9-4539-add3-bca8deb6927b","Type":"ContainerStarted","Data":"266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.039115 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerStarted","Data":"7015c1e84ed6e18e3b8cde213cbeb55a16a919347d04f6b8478889b3d4e6940a"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.053139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-k85np" event={"ID":"dce6ddda-3fcf-40bd-a085-a09f0bb811bf","Type":"ContainerStarted","Data":"3cf4d667ac59419a36246906693804cbe14bf8d01c86c599bfb479efa95801d7"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.055056 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" event={"ID":"e60f8ca8-5b2f-4b5c-930f-19caf45014ba","Type":"ContainerStarted","Data":"10f5a9c41b0882c8fded0f55c672a5d8b02eef3255b1b1f47cce019ae1469341"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.058501 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.080605 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.091819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.092474 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.592455965 +0000 UTC m=+143.174927811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.106644 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" event={"ID":"225d1d1d-8168-4489-af91-6a87f28c39ed","Type":"ContainerStarted","Data":"eb1d22c22a294b7deb745d7b54826225299ad5427a1bbcf15e543edc5f3dcc2d"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.106679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" event={"ID":"225d1d1d-8168-4489-af91-6a87f28c39ed","Type":"ContainerStarted","Data":"979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.108694 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" event={"ID":"1d0fd4e8-ba15-4d2f-9602-e887819ea423","Type":"ContainerStarted","Data":"7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.111553 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerStarted","Data":"5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.114972 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" event={"ID":"cba11394-4e55-4edc-beec-750bddabc1d0","Type":"ContainerStarted","Data":"018e8f198a3dd5320e311eef6f8370fe38bcba79bb6f1a512897862b6b92b75d"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169494 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" event={"ID":"68c1fde1-72ce-4ce0-ade8-9c8e7016464c","Type":"ContainerStarted","Data":"84c2c333f020a07a49798fe3eb6487df75a4cccc90dae7ecdb0347cd6f11a48f"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" event={"ID":"68c1fde1-72ce-4ce0-ade8-9c8e7016464c","Type":"ContainerStarted","Data":"e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" event={"ID":"d962f6fe-d955-483d-b149-976a11dd4922","Type":"ContainerStarted","Data":"aae3072d24cd492e29e371f694952c0b1ae60073bfa2edb5ff693cf52c8c575b"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169627 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" event={"ID":"d962f6fe-d955-483d-b149-976a11dd4922","Type":"ContainerStarted","Data":"9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.197315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.197520 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.697476542 +0000 UTC m=+143.279948388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.198311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.200240 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.700219049 +0000 UTC m=+143.282690885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.221504 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.311944 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.312340 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.812322221 +0000 UTC m=+143.394794067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.374661 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.436059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.436540 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.936520875 +0000 UTC m=+143.518992721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.540442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.540609 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.040568581 +0000 UTC m=+143.623040427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.540880 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.541197 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.041188581 +0000 UTC m=+143.623660427 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: W0220 09:57:31.607034 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3721fc4d_6f04_458e_a74c_0fe816908414.slice/crio-5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f WatchSource:0}: Error finding container 5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f: Status 404 returned error can't find the container with id 5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.647639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.648077 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.148061437 +0000 UTC m=+143.730533283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.693544 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tv8j9" podStartSLOduration=122.693529528 podStartE2EDuration="2m2.693529528s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:31.652636682 +0000 UTC m=+143.235108528" watchObservedRunningTime="2026-02-20 09:57:31.693529528 +0000 UTC m=+143.276001374" Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.697436 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.748974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.750296 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:32.250283005 +0000 UTC m=+143.832754851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.821991 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.826488 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.851194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.851611 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.351582794 +0000 UTC m=+143.934054640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.952662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.953515 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.453496663 +0000 UTC m=+144.035968519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: W0220 09:57:31.960412 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90b20e7_a8bc_4b8d_b407_f4f31fc96528.slice/crio-61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5 WatchSource:0}: Error finding container 61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5: Status 404 returned error can't find the container with id 61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5 Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.054614 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.054762 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.554734591 +0000 UTC m=+144.137206437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.054992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.055343 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.555326029 +0000 UTC m=+144.137797875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.156716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.157815 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.657795536 +0000 UTC m=+144.240267382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.200492 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" event={"ID":"75c3ba8d-4548-4407-9188-a785ef05da2c","Type":"ContainerStarted","Data":"1825648d2577639b0eef3e6e3d827d473411acb221b78700035f5d33dd279261"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.204132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" event={"ID":"68c1fde1-72ce-4ce0-ade8-9c8e7016464c","Type":"ContainerStarted","Data":"d30ea5fb486c0fc821ae6a1b6ba524bb01c9c8f724e754fe5a584a0d1c8fe783"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.210564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" event={"ID":"474b1e5d-9a6f-4931-be66-8fb20c82ac60","Type":"ContainerStarted","Data":"8804485d7428611684c404d2b65ab54f06c494819854af7f13ec9e737899d467"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.211954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-k85np" event={"ID":"dce6ddda-3fcf-40bd-a085-a09f0bb811bf","Type":"ContainerStarted","Data":"1a56689b77c2b8e93534ee778609bb99633a7ca90b3cee7a24f40467bc915ef2"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.212731 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.215288 4962 patch_prober.go:28] interesting pod/console-operator-58897d9998-k85np container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.215395 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-k85np" podUID="dce6ddda-3fcf-40bd-a085-a09f0bb811bf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.231054 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" event={"ID":"b55a13cf-03c6-46d9-b286-960a839b1558","Type":"ContainerStarted","Data":"3119b7ebea6326ada31bb463c372a5d9f7a6209ab5ce0c48cc94342a9d2942e5"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.253294 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" podStartSLOduration=123.253275631 podStartE2EDuration="2m3.253275631s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.242834949 +0000 UTC m=+143.825306795" watchObservedRunningTime="2026-02-20 09:57:32.253275631 +0000 UTC m=+143.835747477" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.258814 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.259064 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.759053374 +0000 UTC m=+144.341525210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.275143 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" podStartSLOduration=122.275128223 podStartE2EDuration="2m2.275128223s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.274058049 +0000 UTC m=+143.856529895" watchObservedRunningTime="2026-02-20 09:57:32.275128223 +0000 UTC m=+143.857600069" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.277097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerStarted","Data":"cd56a03ac3d90ac4add9269473ce5a604989c53b0a0255f1569b9bc0f7ebe2fb"} 
Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.299445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" event={"ID":"7c85c4ba-4bcb-4449-bd63-320f2ff6a116","Type":"ContainerStarted","Data":"e8bddac0932763828dcb0af23a368dc6c942d28566917752b05ca0c3dddcdfdc"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.302049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" event={"ID":"ac7d606a-36a8-4608-918c-ed88eaf93a6d","Type":"ContainerStarted","Data":"ccd0adf9d96a51dc5147431e994c9a007279956cfb3aa6b6ea7ae8ddbd115871"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.303499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tcwqj" event={"ID":"7e5e4942-63be-4811-8aaa-d6b53a427541","Type":"ContainerStarted","Data":"a375dfaf77c6ef6694cbb1091ae581f3820131e5754051883f0ac4cf32012273"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.306206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l92fq" event={"ID":"32025b2b-9232-449f-b7bc-582d81d76430","Type":"ContainerStarted","Data":"8d13176361d41e90849a4d2ef515174dffaf593bc753444cb3080b0a89087860"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.327835 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" podStartSLOduration=123.327806902 podStartE2EDuration="2m3.327806902s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.311171515 +0000 UTC m=+143.893643351" watchObservedRunningTime="2026-02-20 09:57:32.327806902 +0000 
UTC m=+143.910278748" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.332431 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"] Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.336224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" event={"ID":"0e4e18be-a43b-492a-981e-b4f9aebff1ab","Type":"ContainerStarted","Data":"0a30e34f3229c122e5231b38f019a2b8de7af2ab36e53adcd1f2873936254083"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.350619 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" event={"ID":"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1","Type":"ContainerStarted","Data":"d5487650c2e8f26da51d7259619288f7bbcb2d90de5bc8676f9c2455cb9b2574"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.354733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" event={"ID":"9e86193f-b3bb-42a8-bccb-00e0cbcbf432","Type":"ContainerStarted","Data":"303f7374a29dc9fa9bf4ee1d09da3f40eae761f7dca4a2d598623fbb79e737a0"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.360250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.361188 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.361439 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.861418456 +0000 UTC m=+144.443890302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.361992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.364072 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.864038989 +0000 UTC m=+144.446519366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.365946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" event={"ID":"3721fc4d-6f04-458e-a74c-0fe816908414","Type":"ContainerStarted","Data":"5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.367639 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" podStartSLOduration=123.367626313 podStartE2EDuration="2m3.367626313s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.366855519 +0000 UTC m=+143.949327365" watchObservedRunningTime="2026-02-20 09:57:32.367626313 +0000 UTC m=+143.950098159" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.383950 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerStarted","Data":"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.384834 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:32 
crc kubenswrapper[4962]: I0220 09:57:32.386077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerStarted","Data":"7a175f5752b9da8dd07abe01e0077ca08911cfa3fb3fa2f627ad42bdc14904eb"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.393309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" event={"ID":"34f42578-fcc9-4539-add3-bca8deb6927b","Type":"ContainerStarted","Data":"c85de57c4db8642fcb0546ac917d9542ad8dd88da3285417835c7294f2fdcb4b"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.394644 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" event={"ID":"e60f8ca8-5b2f-4b5c-930f-19caf45014ba","Type":"ContainerStarted","Data":"62421acd7645aecb362856d5c4c72323a0863ab4a460f226e65cec6ba0c4b0ed"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.400708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" event={"ID":"a90b20e7-a8bc-4b8d-b407-f4f31fc96528","Type":"ContainerStarted","Data":"61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.404522 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.404580 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: 
connect: connection refused" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.409623 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf"] Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.462737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.463299 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.963220542 +0000 UTC m=+144.545692388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.464029 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.467566 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.967549909 +0000 UTC m=+144.550021755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.497642 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.523058 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-k85np" podStartSLOduration=123.523040427 podStartE2EDuration="2m3.523040427s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.507555476 +0000 UTC m=+144.090027322" watchObservedRunningTime="2026-02-20 09:57:32.523040427 +0000 UTC m=+144.105512263" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.565543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.575904 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:33.07586286 +0000 UTC m=+144.658334706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.601203 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.659533 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" podStartSLOduration=123.659513611 podStartE2EDuration="2m3.659513611s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.659311804 +0000 UTC m=+144.241783650" watchObservedRunningTime="2026-02-20 09:57:32.659513611 +0000 UTC m=+144.241985457" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.671111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.671522 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.17150619 +0000 UTC m=+144.753978026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.742561 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" podStartSLOduration=122.742545041 podStartE2EDuration="2m2.742545041s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.738872834 +0000 UTC m=+144.321344680" watchObservedRunningTime="2026-02-20 09:57:32.742545041 +0000 UTC m=+144.325016887" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.776537 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.777011 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:33.276988002 +0000 UTC m=+144.859459848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.881116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.881826 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.381809023 +0000 UTC m=+144.964280859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.983193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.983838 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.483815345 +0000 UTC m=+145.066287191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.997875 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7tj4j"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.015459 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" podStartSLOduration=123.015432176 podStartE2EDuration="2m3.015432176s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.013793444 +0000 UTC m=+144.596265290" watchObservedRunningTime="2026-02-20 09:57:33.015432176 +0000 UTC m=+144.597904022" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.037351 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.060185 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4mw9f"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.073247 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7nh4t"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.081756 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" 
podStartSLOduration=123.081730457 podStartE2EDuration="2m3.081730457s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.077225853 +0000 UTC m=+144.659697699" watchObservedRunningTime="2026-02-20 09:57:33.081730457 +0000 UTC m=+144.664202303" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.099469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.100145 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.600126349 +0000 UTC m=+145.182598195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.105995 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7q8sx"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.109925 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lbvml"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.109971 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.112835 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.117438 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" podStartSLOduration=124.117420987 podStartE2EDuration="2m4.117420987s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.11496364 +0000 UTC m=+144.697435486" watchObservedRunningTime="2026-02-20 09:57:33.117420987 +0000 UTC m=+144.699892833" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.121578 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.165244 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ckmh2"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.200151 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.201527 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.701497131 +0000 UTC m=+145.283968977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.201631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.202098 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.702077569 +0000 UTC m=+145.284549415 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.207088 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.303249 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.303489 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.803462282 +0000 UTC m=+145.385934128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.303574 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.303978 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.803965367 +0000 UTC m=+145.386437213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.404791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.405201 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.905156953 +0000 UTC m=+145.487628799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.405490 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.406937 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.906928349 +0000 UTC m=+145.489400195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.426343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" event={"ID":"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800","Type":"ContainerStarted","Data":"443b3cac7afd3608f314f45094307c5aed4aec3bb9bcf251c449de2e6b14bd36"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.430752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" event={"ID":"c3febff6-f15f-4ce8-825c-37d86b13c56d","Type":"ContainerStarted","Data":"f2f1692cd20ad2c1f8b5bc92645335660fd9045760c4ee971c36e503d0c5cc41"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.435735 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" event={"ID":"a7a9fa76-da75-4847-a539-d1e6bb57da98","Type":"ContainerStarted","Data":"1e420be70bfd095b03c94c8eabb6da93d1778cb968c541a4d24f98c60cf0a857"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.440036 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"890ef7d22eed2deafad0de9e66c59e509bb9078677f393ebf15cc4bb08f0bf11"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.441306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" 
event={"ID":"0240e440-4be2-4607-99c4-636b65e78081","Type":"ContainerStarted","Data":"258bdae986515b2708b3f343587de68829ab40bdb5d44138b42c82fc1fad25f3"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.447579 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerStarted","Data":"c81440f2bd45daadf6efa1fe9a3de8fa8cfa794ff12c8106c2aad73b69faa130"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.465994 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" event={"ID":"1d0fd4e8-ba15-4d2f-9602-e887819ea423","Type":"ContainerStarted","Data":"653c2aac4092a0a1999cbb7055660e2eec23fd820e1d9df6c3ac9693bb4d11cc"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.466938 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" event={"ID":"daef1622-b612-4661-bb6a-63c5997d9a07","Type":"ContainerStarted","Data":"6a1d062b6193d4c5c8f9d78717ac691b2c631795050bcf3ba212d233beabe607"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.468080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" event={"ID":"674f40ed-74ed-48c2-8036-087ce9e16c94","Type":"ContainerStarted","Data":"139c754b94229153f4b84281a788404da840094bb1d0a105262d877cf8cb5a0c"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.475864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tcwqj" event={"ID":"7e5e4942-63be-4811-8aaa-d6b53a427541","Type":"ContainerStarted","Data":"d3889e2e711be3549254cc518553b9bfd0b57367648743f0a855f2983147e2f3"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.478306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" event={"ID":"5df91f4a-70e8-4036-8ab1-d917af6c8aa4","Type":"ContainerStarted","Data":"ddded4e705c2b3fd75ddf8541ef962e4a3ecd989a60a84d48a40e7b20b9a8421"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.492489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mw9f" event={"ID":"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c","Type":"ContainerStarted","Data":"8b93339f1d51a9d0cb8f225cdc9c09e5eb5a504bc88354f5124e186878dacb81"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.499571 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.499648 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.506669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.507227 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.007207316 +0000 UTC m=+145.589679162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.508477 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tcwqj" podStartSLOduration=124.508455295 podStartE2EDuration="2m4.508455295s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.506604927 +0000 UTC m=+145.089076793" watchObservedRunningTime="2026-02-20 09:57:33.508455295 +0000 UTC m=+145.090927141" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.509350 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerStarted","Data":"9cb82e35bd3f3d4fb0aaea07b6c95b315a4035aa82123595066c03e2ec02bbc3"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.515721 4962 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jtftl container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]log ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]etcd ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/generic-apiserver-start-informers ok Feb 20 09:57:33 crc 
kubenswrapper[4962]: [+]poststarthook/max-in-flight-filter ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 20 09:57:33 crc kubenswrapper[4962]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 20 09:57:33 crc kubenswrapper[4962]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/project.openshift.io-projectcache ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/openshift.io-startinformers ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 20 09:57:33 crc kubenswrapper[4962]: livez check failed Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.515775 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" podUID="37e7b911-da73-4f82-ad0c-d8707547b7a7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.517376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" event={"ID":"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b","Type":"ContainerStarted","Data":"cbec25d696c6d5ff8e5be0198c3b52568e7c4618accc34f427c471862e581419"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.521879 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" 
event={"ID":"9c993e86-3068-4d07-84b3-655f8308b7ed","Type":"ContainerStarted","Data":"b980330296027c540ad89facdfc9f0e13573c200a01374ad863e804b57231972"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.522643 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7tj4j" event={"ID":"319cf696-9a12-40dc-9f4a-d80fab9a97f8","Type":"ContainerStarted","Data":"d2d3b1d0890dcecf34dcf952d3f23eda2c6d26d1faf52514c85aa14b5ab28740"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.528982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" event={"ID":"b55a13cf-03c6-46d9-b286-960a839b1558","Type":"ContainerStarted","Data":"e3edf1e340c556e0c6a8d4c9a2675f28f20603127816955312d7521dd57cf9a4"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.533040 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.533094 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.535004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" event={"ID":"9e86193f-b3bb-42a8-bccb-00e0cbcbf432","Type":"ContainerStarted","Data":"08df1ea0ff1879cfe156ee1a1acd0b7b9313f8fdf8c1bda0d53f7d3ee0a797f1"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.554308 4962 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" podStartSLOduration=124.554285897 podStartE2EDuration="2m4.554285897s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.552922534 +0000 UTC m=+145.135394380" watchObservedRunningTime="2026-02-20 09:57:33.554285897 +0000 UTC m=+145.136757753" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.609543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.611388 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.111361996 +0000 UTC m=+145.693833842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.711763 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.712127 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.212113068 +0000 UTC m=+145.794584914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.848988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.850216 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.350198352 +0000 UTC m=+145.932670188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.953723 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.955434 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.455415616 +0000 UTC m=+146.037887462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.961338 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.963174 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.963247 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.058299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.059133 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.559117691 +0000 UTC m=+146.141589537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.159572 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.160428 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.660386469 +0000 UTC m=+146.242858315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.160633 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.160966 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.660951797 +0000 UTC m=+146.243423643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.262552 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.263825 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.763805036 +0000 UTC m=+146.346276882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.365363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.365683 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.865669893 +0000 UTC m=+146.448141729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.471097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.471377 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.97132795 +0000 UTC m=+146.553799796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.471857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.472171 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.972156776 +0000 UTC m=+146.554628622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.484733 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.577008 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.577740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.07771632 +0000 UTC m=+146.660188166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.582353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerStarted","Data":"e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.618624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" event={"ID":"77ff4d6a-8c1e-440f-a78c-900c09587848","Type":"ContainerStarted","Data":"5c1664ea6d1342cde3ee939a67b0d6fab903a6ff0752970679026e989c5b22da"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.630347 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" podStartSLOduration=125.630329278 podStartE2EDuration="2m5.630329278s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.628728496 +0000 UTC m=+146.211200342" watchObservedRunningTime="2026-02-20 09:57:34.630329278 +0000 UTC m=+146.212801114" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.658827 4962 generic.go:334] "Generic (PLEG): container finished" podID="0e4e18be-a43b-492a-981e-b4f9aebff1ab" containerID="30029afbf222fa2dec68b5070c70cd75e24e973f608bf3f26517175df7acbb18" exitCode=0 Feb 20 
09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.658903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" event={"ID":"0e4e18be-a43b-492a-981e-b4f9aebff1ab","Type":"ContainerDied","Data":"30029afbf222fa2dec68b5070c70cd75e24e973f608bf3f26517175df7acbb18"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.685447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.685766 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.185754063 +0000 UTC m=+146.768225909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.704192 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" event={"ID":"3721fc4d-6f04-458e-a74c-0fe816908414","Type":"ContainerStarted","Data":"2bdfa323009c4f4a85d15bb613412a29a700169e00232fe659c46b40638f9a93"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.725163 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7tj4j" event={"ID":"319cf696-9a12-40dc-9f4a-d80fab9a97f8","Type":"ContainerStarted","Data":"815f1756cdaba49e706bd4304aad70e20e24ade6e7f969af3b41c9f3d5767389"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.731944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerStarted","Data":"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.772634 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" event={"ID":"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800","Type":"ContainerStarted","Data":"6847c2056385d4b2dd02a42a9a9d446c50c08eafa2f1e6cc3d0c3558968bff47"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.772997 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.783109 4962 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g6nc2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.783169 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" podUID="bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.786667 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.787315 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.287294561 +0000 UTC m=+146.869766407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.788384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" event={"ID":"474b1e5d-9a6f-4931-be66-8fb20c82ac60","Type":"ContainerStarted","Data":"fd8fc214b400152a5477878bb64f315d0e48dee95f1111925643f740c4d287b9"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.790795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" event={"ID":"b55a13cf-03c6-46d9-b286-960a839b1558","Type":"ContainerStarted","Data":"a2337c63c0d070eb3f42eff63f1912b6c6c0c51643c6c305ca770b392ac7df8a"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.792068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" event={"ID":"c3febff6-f15f-4ce8-825c-37d86b13c56d","Type":"ContainerStarted","Data":"57b3e3f27f28e048385a642c6e636092605d8051a0ecf5d01b844ae14fcbcd98"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.793013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" event={"ID":"daef1622-b612-4661-bb6a-63c5997d9a07","Type":"ContainerStarted","Data":"623f5a83713eae9ae66915a5f02b66fb1c906e36ace1bfe89b27459ea8063b58"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.793635 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.803715 4962 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rsf8j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.803774 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" podUID="daef1622-b612-4661-bb6a-63c5997d9a07" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.822817 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" event={"ID":"0240e440-4be2-4607-99c4-636b65e78081","Type":"ContainerStarted","Data":"2fd6ed07b48189abc8523e249d3b188bdfe101528afc68686b75589efb8e5e19"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.835402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerStarted","Data":"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.836258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.844694 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7z5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: 
connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.844725 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.846781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" event={"ID":"a90b20e7-a8bc-4b8d-b407-f4f31fc96528","Type":"ContainerStarted","Data":"a99251a9a36d63a1de6dacdab4d78da084782809d03e524d2845e8f4dddfee56"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.865417 4962 generic.go:334] "Generic (PLEG): container finished" podID="cba11394-4e55-4edc-beec-750bddabc1d0" containerID="0bca8520e1be4b3ef76bf6c2a482d5c0ab095f3e2903c5da2f7a07119447c61a" exitCode=0 Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.865510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" event={"ID":"cba11394-4e55-4edc-beec-750bddabc1d0","Type":"ContainerDied","Data":"0bca8520e1be4b3ef76bf6c2a482d5c0ab095f3e2903c5da2f7a07119447c61a"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.887840 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nwfk6" podStartSLOduration=125.887818945 podStartE2EDuration="2m5.887818945s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.881799164 +0000 UTC m=+146.464271010" watchObservedRunningTime="2026-02-20 09:57:34.887818945 +0000 UTC m=+146.470290791" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 
09:57:34.888044 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" podStartSLOduration=124.888039612 podStartE2EDuration="2m4.888039612s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.768380271 +0000 UTC m=+146.350852117" watchObservedRunningTime="2026-02-20 09:57:34.888039612 +0000 UTC m=+146.470511458" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.888701 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.890563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" event={"ID":"34f42578-fcc9-4539-add3-bca8deb6927b","Type":"ContainerStarted","Data":"9723cce7b81e12ca5ae6a0cf324a742ef329141e85ac2bbb0e1234b2b48301c9"} Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.891856 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.391842662 +0000 UTC m=+146.974314508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.914235 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7tj4j" podStartSLOduration=7.91420019 podStartE2EDuration="7.91420019s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.912745305 +0000 UTC m=+146.495217151" watchObservedRunningTime="2026-02-20 09:57:34.91420019 +0000 UTC m=+146.496672036" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.920571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" event={"ID":"7c85c4ba-4bcb-4449-bd63-320f2ff6a116","Type":"ContainerStarted","Data":"44b500957f3dd40a8c062d7793bb38ee7f4e90c96eec1518e1be022170a3ed13"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.939807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerStarted","Data":"7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.940831 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.947948 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" event={"ID":"9c993e86-3068-4d07-84b3-655f8308b7ed","Type":"ContainerStarted","Data":"5e4ea3b700baf309053aa9ba3f2593cde7c92ddcbb20f78cab4d07dec31a031e"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.949027 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" event={"ID":"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1","Type":"ContainerStarted","Data":"fde666550b488c9bfb71c2f2d3ec18c4adb04ae0eed789c39e4bfeac488c04fb"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.951043 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l92fq" event={"ID":"32025b2b-9232-449f-b7bc-582d81d76430","Type":"ContainerStarted","Data":"782c84f5d1345ec0f04dc5e87e15446209ef1f59e146f06988c620f04d7c356e"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.954261 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" podStartSLOduration=125.954243319 podStartE2EDuration="2m5.954243319s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.946007108 +0000 UTC m=+146.528478954" watchObservedRunningTime="2026-02-20 09:57:34.954243319 +0000 UTC m=+146.536715165" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.961378 4962 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mrzbm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.961436 4962 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.004787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.017701 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:35 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:35 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:35 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.017785 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.018792 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.518774304 +0000 UTC m=+147.101246150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.051255 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" podStartSLOduration=125.051229821 podStartE2EDuration="2m5.051229821s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.003617273 +0000 UTC m=+146.586089119" watchObservedRunningTime="2026-02-20 09:57:35.051229821 +0000 UTC m=+146.633701667" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.079265 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" podStartSLOduration=125.079228948 podStartE2EDuration="2m5.079228948s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.049519338 +0000 UTC m=+146.631991194" watchObservedRunningTime="2026-02-20 09:57:35.079228948 +0000 UTC m=+146.661700794" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.102723 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" podStartSLOduration=125.102699472 podStartE2EDuration="2m5.102699472s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.101021989 +0000 UTC m=+146.683493835" watchObservedRunningTime="2026-02-20 09:57:35.102699472 +0000 UTC m=+146.685171318" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.132246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.138047 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.638023691 +0000 UTC m=+147.220495757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.151843 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" podStartSLOduration=126.151802828 podStartE2EDuration="2m6.151802828s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.149005289 +0000 UTC m=+146.731477135" watchObservedRunningTime="2026-02-20 09:57:35.151802828 +0000 UTC m=+146.734274674" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.194562 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" podStartSLOduration=125.194538402 podStartE2EDuration="2m5.194538402s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.186360413 +0000 UTC m=+146.768832259" watchObservedRunningTime="2026-02-20 09:57:35.194538402 +0000 UTC m=+146.777010248" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.199658 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.200460 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.200662 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.204007 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.204043 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.223079 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podStartSLOduration=125.223035075 podStartE2EDuration="2m5.223035075s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.219452381 +0000 UTC m=+146.801924227" watchObservedRunningTime="2026-02-20 09:57:35.223035075 +0000 UTC m=+146.805506921" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.233261 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.233686 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:35.733669821 +0000 UTC m=+147.316141667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.333450 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" podStartSLOduration=125.333421051 podStartE2EDuration="2m5.333421051s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.279970869 +0000 UTC m=+146.862442715" watchObservedRunningTime="2026-02-20 09:57:35.333421051 +0000 UTC m=+146.915892897" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.336502 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.336538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 
20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.336613 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.337633 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.837616725 +0000 UTC m=+147.420088571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.383300 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" podStartSLOduration=126.383275001 podStartE2EDuration="2m6.383275001s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.33809508 +0000 UTC m=+146.920566926" watchObservedRunningTime="2026-02-20 09:57:35.383275001 +0000 UTC m=+146.965746847" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.409939 4962 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" podStartSLOduration=126.409914655 podStartE2EDuration="2m6.409914655s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.385163761 +0000 UTC m=+146.967635607" watchObservedRunningTime="2026-02-20 09:57:35.409914655 +0000 UTC m=+146.992386501" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443164 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" podStartSLOduration=126.443145808 podStartE2EDuration="2m6.443145808s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.441910269 +0000 UTC m=+147.024382115" watchObservedRunningTime="2026-02-20 09:57:35.443145808 +0000 UTC m=+147.025617654" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443308 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443548 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.443857 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.94383886 +0000 UTC m=+147.526310706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.471654 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.471928 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-l92fq" podStartSLOduration=8.471902919 podStartE2EDuration="8.471902919s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.466312272 +0000 UTC m=+147.048784118" watchObservedRunningTime="2026-02-20 09:57:35.471902919 +0000 UTC m=+147.054374765" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.542174 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.545367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.545850 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.045826701 +0000 UTC m=+147.628298717 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.646702 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.647008 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.146957324 +0000 UTC m=+147.729429170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.647850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.648283 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.148263166 +0000 UTC m=+147.730735012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.748803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.749059 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.249012168 +0000 UTC m=+147.831484014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.749153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.749687 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.249671469 +0000 UTC m=+147.832143315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.850633 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.851069 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.351047081 +0000 UTC m=+147.933518927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.952280 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.952793 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.452768233 +0000 UTC m=+148.035240069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.965888 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 09:57:35 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld
Feb 20 09:57:35 crc kubenswrapper[4962]: [+]process-running ok
Feb 20 09:57:35 crc kubenswrapper[4962]: healthz check failed
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.965985 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.978541 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.979254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"2d835a5c085a62d70bc8de6c2cbbf96b10458ff46ab6c5eb78e948e87cbfd48a"}
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.983332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" event={"ID":"3721fc4d-6f04-458e-a74c-0fe816908414","Type":"ContainerStarted","Data":"fc0a665ba0187f4519015f863c75c4e7b1fc1d7e9d8e5acbdc182b234426711b"}
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.986008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" event={"ID":"1d0fd4e8-ba15-4d2f-9602-e887819ea423","Type":"ContainerStarted","Data":"f52f197f1e495c3db1e3057530f87d97247fe176ad48fc7d93be58c3826b76cc"}
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.990821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" event={"ID":"5df91f4a-70e8-4036-8ab1-d917af6c8aa4","Type":"ContainerStarted","Data":"8507e85e72f1b098bf2eb294ee7cb575a3e07472243bb92e708a75b27f238ef6"}
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.991668 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.994981 4962 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-965lm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body=
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.995030 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" podUID="5df91f4a-70e8-4036-8ab1-d917af6c8aa4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused"
Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.998970 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" event={"ID":"cba11394-4e55-4edc-beec-750bddabc1d0","Type":"ContainerStarted","Data":"bf64eb1191de2992ecf1fe7024d9c2fb8f001c7596f96f6b6660c08cf585d672"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.008333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mw9f" event={"ID":"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c","Type":"ContainerStarted","Data":"9c23099cf62e1856118f9e256d60bc14a4eaabb786daaaefefdfbc09d7272bed"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.008374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mw9f" event={"ID":"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c","Type":"ContainerStarted","Data":"c8b290a446b1a0ff68897fb714b64c2c886fd7c064948346afa4d53169d5090b"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.008777 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4mw9f"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.013162 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" podStartSLOduration=127.013142316 podStartE2EDuration="2m7.013142316s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.012374261 +0000 UTC m=+147.594846117" watchObservedRunningTime="2026-02-20 09:57:36.013142316 +0000 UTC m=+147.595614162"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.015430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" event={"ID":"c3febff6-f15f-4ce8-825c-37d86b13c56d","Type":"ContainerStarted","Data":"26c1d96111b69416936f5ec08aca7b6b3d82f231dcc1fb41a30595d176a2b78a"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.030520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" event={"ID":"a7a9fa76-da75-4847-a539-d1e6bb57da98","Type":"ContainerStarted","Data":"1a98b0ae0814bfea7cc97db524565b02c5a34c29f92af1c6afdbfecf632676a2"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.030621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" event={"ID":"a7a9fa76-da75-4847-a539-d1e6bb57da98","Type":"ContainerStarted","Data":"869b047e6d8c83815379af5cc852e044730609a76341d85f68e1eaacce78d4fd"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.035535 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" podStartSLOduration=126.035519245 podStartE2EDuration="2m6.035519245s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.034814972 +0000 UTC m=+147.617286818" watchObservedRunningTime="2026-02-20 09:57:36.035519245 +0000 UTC m=+147.617991091"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.039967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" event={"ID":"674f40ed-74ed-48c2-8036-087ce9e16c94","Type":"ContainerStarted","Data":"9cffb7dc2af2b1a98935ee59a85ac9e57c1c54d7fb4bb9a7c80dd05dece660ce"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.043639 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" event={"ID":"77ff4d6a-8c1e-440f-a78c-900c09587848","Type":"ContainerStarted","Data":"90810cb9fd7bdb3cb8ae0a1a0443445cc9e7ecae14c9e37dea5779b65a7767d0"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054153 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.054378 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.554337611 +0000 UTC m=+148.136809457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054861 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.057926 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.557908984 +0000 UTC m=+148.140380830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.059273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.063767 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" event={"ID":"9c993e86-3068-4d07-84b3-655f8308b7ed","Type":"ContainerStarted","Data":"51741528a20919ae4db90e7b195e0c13fb3633bd047b993855bd1644447145bc"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.064852 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.064880 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.065161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.070747 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.071236 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4mw9f" podStartSLOduration=9.071222576 podStartE2EDuration="9.071222576s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.07072203 +0000 UTC m=+147.653193876" watchObservedRunningTime="2026-02-20 09:57:36.071222576 +0000 UTC m=+147.653694422"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.079357 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" event={"ID":"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b","Type":"ContainerStarted","Data":"6df0b4009565b100e9117e0b6f2259dcd8593ad7d6d9ab74f016f684565c1aac"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.099760 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" event={"ID":"0e4e18be-a43b-492a-981e-b4f9aebff1ab","Type":"ContainerStarted","Data":"9bc76df3e93c4ee55a39198dbe10bbace1254f7e66e141cd72eacdb0c8f0ebe4"}
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102329 4962 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rsf8j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102382 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" podUID="daef1622-b612-4661-bb6a-63c5997d9a07" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102638 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7z5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102721 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.103696 4962 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g6nc2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body=
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.103763 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" podUID="bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.114574 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" podStartSLOduration=126.114544458 podStartE2EDuration="2m6.114544458s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.112207234 +0000 UTC m=+147.694679090" watchObservedRunningTime="2026-02-20 09:57:36.114544458 +0000 UTC m=+147.697016304"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.156044 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.156531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.160611 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.660573817 +0000 UTC m=+148.243045663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.187498 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" podStartSLOduration=126.187480279 podStartE2EDuration="2m6.187480279s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.171305296 +0000 UTC m=+147.753777142" watchObservedRunningTime="2026-02-20 09:57:36.187480279 +0000 UTC m=+147.769952125"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.188183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.200783 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" podStartSLOduration=127.200710258 podStartE2EDuration="2m7.200710258s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.13921106 +0000 UTC m=+147.721682906" watchObservedRunningTime="2026-02-20 09:57:36.200710258 +0000 UTC m=+147.783182114"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.230209 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" podStartSLOduration=126.230190962 podStartE2EDuration="2m6.230190962s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.230017276 +0000 UTC m=+147.812489122" watchObservedRunningTime="2026-02-20 09:57:36.230190962 +0000 UTC m=+147.812662808"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.253332 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" podStartSLOduration=126.253311235 podStartE2EDuration="2m6.253311235s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.250041821 +0000 UTC m=+147.832513667" watchObservedRunningTime="2026-02-20 09:57:36.253311235 +0000 UTC m=+147.835783081"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.259224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.266258 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.766240284 +0000 UTC m=+148.348712130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.281304 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" podStartSLOduration=127.28127136 podStartE2EDuration="2m7.28127136s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.28126306 +0000 UTC m=+147.863734906" watchObservedRunningTime="2026-02-20 09:57:36.28127136 +0000 UTC m=+147.863743206"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.393697 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.400182 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.400376 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.400816 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.900794487 +0000 UTC m=+148.483266333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.460064 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" podStartSLOduration=126.460047484 podStartE2EDuration="2m6.460047484s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.332002558 +0000 UTC m=+147.914474404" watchObservedRunningTime="2026-02-20 09:57:36.460047484 +0000 UTC m=+148.042519330"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.477063 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.502368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.502885 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.002868641 +0000 UTC m=+148.585340487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.580027 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.604578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.604784 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.104742008 +0000 UTC m=+148.687213854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.604902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.605225 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.105190642 +0000 UTC m=+148.687662488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.649268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" podStartSLOduration=127.649249298 podStartE2EDuration="2m7.649249298s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.463313458 +0000 UTC m=+148.045785304" watchObservedRunningTime="2026-02-20 09:57:36.649249298 +0000 UTC m=+148.231721144"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.706431 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.706835 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.206820452 +0000 UTC m=+148.789292298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.808162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.808622 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.308586767 +0000 UTC m=+148.891058613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.908939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.909129 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.409101651 +0000 UTC m=+148.991573497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.909796 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.910190 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.410181685 +0000 UTC m=+148.992653591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.970287 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 09:57:36 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld
Feb 20 09:57:36 crc kubenswrapper[4962]: [+]process-running ok
Feb 20 09:57:36 crc kubenswrapper[4962]: healthz check failed
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.970343 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.997297 4962 csr.go:261] certificate signing request csr-pmnxg is approved, waiting to be issued
Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.011644 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.012136 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.512089393 +0000 UTC m=+149.094561239 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.032555 4962 csr.go:257] certificate signing request csr-pmnxg is issued
Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.127444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.128173 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.62815989 +0000 UTC m=+149.210631736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.230000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.230186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9c04c443e027d8924547d85ee062566809140b12e02a27ec77d3e52915a7c22e"}
Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.231314 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.731289748 +0000 UTC m=+149.313761594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.281200 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerStarted","Data":"22f0d08c036c0314545671011b2622d5765c7a6621317839ad590b2d4b0ed9d6"} Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.281263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerStarted","Data":"bd7d4127579c5aa2ce40f267a04e89a7814800df62a8bbe88b691f551a516f6e"} Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.331224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.331581 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.831568945 +0000 UTC m=+149.414040791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.355700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"d55b0414e4e28d7057524b2c28fe19f6a5e9804ff735c31b1c129a4e9cad1b7a"} Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.363706 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7z5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.363752 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.438876 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.439453 4962 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.939426751 +0000 UTC m=+149.521898597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.461738 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:37 crc kubenswrapper[4962]: W0220 09:57:37.485764 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306 WatchSource:0}: Error finding container 23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306: Status 404 returned error can't find the container with id 23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306 Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.551949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc 
kubenswrapper[4962]: E0220 09:57:37.557180 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.057163181 +0000 UTC m=+149.639635027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.641481 4962 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.668217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.668509 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.168492798 +0000 UTC m=+149.750964644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.770531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.771672 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.271646947 +0000 UTC m=+149.854118793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.872574 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.872852 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.372838442 +0000 UTC m=+149.955310288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.963603 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:37 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:37 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:37 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.963672 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.973825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.974148 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:38.474136741 +0000 UTC m=+150.056608587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.034558 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-20 09:52:36 +0000 UTC, rotation deadline is 2026-12-05 20:03:26.332715871 +0000 UTC Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.034631 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6922h5m48.298088693s for next certificate rotation Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.075077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.075265 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.575234834 +0000 UTC m=+150.157706680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.075444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.075893 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.575876075 +0000 UTC m=+150.158347921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.176148 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.176643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.177099 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.677075651 +0000 UTC m=+150.259547497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.258368 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.259402 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.265231 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.272930 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278412 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.278703 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.77868617 +0000 UTC m=+150.361158016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.321696 4962 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-20T09:57:37.641513674Z","Handler":null,"Name":""} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.329747 4962 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.329813 4962 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.371865 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cef9b30fae3cc65222f2b69910474ee9d3e9c6671f9d9b0373e397e9a6c203c7"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.371924 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.372329 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.373737 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5077da1ec905ac4d8b00fc750f998dc9b6753887e8cab94c7a76cf4dafa7d4c2"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.375365 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1f363d461cf8561be841943af5ab4b48d9c3b3fed020a6cc20ea47dad7c65227"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.375427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"382532b5ac93bf8cd62633a3dac7df6ef3102e6029d7bb44b9438f0797fd2596"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.376995 4962 generic.go:334] "Generic (PLEG): container finished" podID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerID="22f0d08c036c0314545671011b2622d5765c7a6621317839ad590b2d4b0ed9d6" exitCode=0 Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.377337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerDied","Data":"22f0d08c036c0314545671011b2622d5765c7a6621317839ad590b2d4b0ed9d6"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.379348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.379771 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.379910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.380268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.380423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.380732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.381155 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerID="e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767" exitCode=0 Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.381219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerDied","Data":"e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.384875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.388196 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"03dc8ce942a5ba94e89de4d324b99e17a76a86be71aae629cd6916c027f856c5"}
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.388323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"b37495346e5f2a55c17afe8dc8f3791bc5cb2e2082f2276176b213cf87971aa8"}
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.397665 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.402233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.461771 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6q5bk"]
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.462808 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.465890 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.476829 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"]
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.481916 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.482032 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.482227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.482284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.505642 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.505724 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.511705 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jtftl"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.522025 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jtftl"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.532314 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" podStartSLOduration=11.532281494 podStartE2EDuration="11.532281494s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:38.528233015 +0000 UTC m=+150.110704851" watchObservedRunningTime="2026-02-20 09:57:38.532281494 +0000 UTC m=+150.114753340"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.579178 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.584526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.584582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.584722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.586924 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.587494 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.634467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.636850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.681741 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"]
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.682985 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.721775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"]
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.729047 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.779867 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.793558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.793604 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.793670 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.864715 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4pzn"]
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.865975 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.891363 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"]
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895196 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895256 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895360 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.896231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.896469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.934567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.973843 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 09:57:38 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld
Feb 20 09:57:38 crc kubenswrapper[4962]: [+]process-running ok
Feb 20 09:57:38 crc kubenswrapper[4962]: healthz check failed
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.973894 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.996705 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.996778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.996827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.997535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.997792 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.000254 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.017599 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.023700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.097493 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") "
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.097924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") "
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.099175 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a07ea40e-b9be-4e90-bf7a-293fa009e7d2" (UID: "a07ea40e-b9be-4e90-bf7a-293fa009e7d2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.105665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a07ea40e-b9be-4e90-bf7a-293fa009e7d2" (UID: "a07ea40e-b9be-4e90-bf7a-293fa009e7d2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.139606 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"]
Feb 20 09:57:39 crc kubenswrapper[4962]: W0220 09:57:39.170150 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee660135_f5e2_420e_a242_440471e57da2.slice/crio-ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377 WatchSource:0}: Error finding container ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377: Status 404 returned error can't find the container with id ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.172773 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.200499 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.200537 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.265446 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"]
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.295003 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"]
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.313177 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.406461 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"]
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.422872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerStarted","Data":"5004d974da71f7174ba7d6f42652143c4f7cb0b752e3647e653cb9e55b56d9b3"}
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.428161 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.428295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerDied","Data":"bd7d4127579c5aa2ce40f267a04e89a7814800df62a8bbe88b691f551a516f6e"}
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.428367 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7d4127579c5aa2ce40f267a04e89a7814800df62a8bbe88b691f551a516f6e"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.432871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerStarted","Data":"dd70ef6c640a62edc318879e7e0b88b18026337e7b55ef136a0601bdad9e609c"}
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.435926 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerStarted","Data":"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c"}
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.435969 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerStarted","Data":"ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377"}
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.600743 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.600875 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.604730 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.604757 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.690766 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"]
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.694792 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.713519 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"a3652dbd-dae4-462b-be88-b8a782de8a1c\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") "
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.713624 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"a3652dbd-dae4-462b-be88-b8a782de8a1c\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") "
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.713651 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"a3652dbd-dae4-462b-be88-b8a782de8a1c\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") "
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.716271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3652dbd-dae4-462b-be88-b8a782de8a1c" (UID: "a3652dbd-dae4-462b-be88-b8a782de8a1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.723622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3652dbd-dae4-462b-be88-b8a782de8a1c" (UID: "a3652dbd-dae4-462b-be88-b8a782de8a1c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.724794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5" (OuterVolumeSpecName: "kube-api-access-hcnb5") pod "a3652dbd-dae4-462b-be88-b8a782de8a1c" (UID: "a3652dbd-dae4-462b-be88-b8a782de8a1c"). InnerVolumeSpecName "kube-api-access-hcnb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.817328 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") on node \"crc\" DevicePath \"\""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.817375 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.817390 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.962241 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.962302 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.964039 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 09:57:39 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld
Feb 20 09:57:39 crc kubenswrapper[4962]: [+]process-running ok
Feb 20 09:57:39 crc kubenswrapper[4962]: healthz check failed
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.964085 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.969420 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.261663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"]
Feb 20 09:57:40 crc kubenswrapper[4962]: E0220 09:57:40.262023 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerName="collect-profiles"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262043 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerName="collect-profiles"
Feb 20 09:57:40 crc kubenswrapper[4962]: E0220 09:57:40.262058 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerName="pruner"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262070 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerName="pruner"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262230 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerName="pruner"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262249 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerName="collect-profiles"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.263436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.269692 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.277570 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"]
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.424259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.424334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.424359 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.443046 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f4075-7fda-4a54-882f-c4fd160148a4" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" exitCode=0
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.443096 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.443153 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerStarted","Data":"75a64fa0799e34c46374b381b4c7c1a53295cecd1a95e7229c96eb57af23d670"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.445204 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" exitCode=0
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.445255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.445750 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.446971 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c487f78-6735-4114-a45a-6c60ccef5983" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" exitCode=0
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.447062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.447510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerStarted","Data":"7fcd5743db242f51c1ff9cb31c18720e064f45718743b615c36cf8bb2d39f79d"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.451105 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee660135-f5e2-420e-a242-440471e57da2" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" exitCode=0
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.451203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.459709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerDied","Data":"9cb82e35bd3f3d4fb0aaea07b6c95b315a4035aa82123595066c03e2ec02bbc3"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.459761 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb82e35bd3f3d4fb0aaea07b6c95b315a4035aa82123595066c03e2ec02bbc3"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.459844 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.462935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerStarted","Data":"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46"}
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.463113 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.470019 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.497470 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" podStartSLOduration=131.497441552 podStartE2EDuration="2m11.497441552s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:40.484386819 +0000 UTC m=+152.066858665" watchObservedRunningTime="2026-02-20 09:57:40.497441552 +0000 UTC m=+152.079913408"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.525596 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.525704 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.525804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.526360 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.526429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.544443 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") "
pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.579300 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.583665 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.583716 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.585954 4962 patch_prober.go:28] interesting pod/console-f9d7485db-nwfk6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.586000 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nwfk6" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.651596 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.653356 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.666297 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.716921 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.729410 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.840252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.840627 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.840715 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.941623 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.941733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.941755 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.942208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.942623 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.960600 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.970856 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:40 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:40 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:40 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.970948 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.992833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.045080 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.275436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.456187 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.457847 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.460565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.473536 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.484764 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f414667-865d-4c89-b470-50f61a11b60e" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" exitCode=0 Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.486664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf"} Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.486696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerStarted","Data":"4ba52fa324168e2ee08b42cdefbfc041b14744aa5c09a51cbc5628b6f08e9f57"} Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.516971 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.517021 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.657209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.658163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.658249 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"redhat-operators-zwkjb\" 
(UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760242 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760999 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.761761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.781690 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.864831 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.867501 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.872899 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.885734 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.958105 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.963508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.963636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.963769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.965342 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.968051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.065498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.065568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.065588 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.066020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.066271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.087994 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.195914 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.259665 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.485017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.494406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerStarted","Data":"1d342c2b45fca81302a1d91ac4539bc994004d9b4834928e5a7a0d50c28cc22c"} Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.496699 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerStarted","Data":"962bcd42638f9814f3627f1f0129094057257d45837d02365bb3acaa7e0e1287"} Feb 20 09:57:42 crc kubenswrapper[4962]: W0220 09:57:42.499498 4962 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4065ac08_9c62_48db_bbfe_9e53ab7d5463.slice/crio-50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601 WatchSource:0}: Error finding container 50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601: Status 404 returned error can't find the container with id 50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.176078 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.177180 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.181496 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.181921 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.186762 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.289819 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.289898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.391546 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.391648 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.391783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.413584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.514549 4962 generic.go:334] "Generic (PLEG): container finished" podID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" exitCode=0 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.514645 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.522163 4962 generic.go:334] "Generic (PLEG): container finished" podID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerID="8158f54e7f263dc473b3aa7c7a0204ebc0dcdf73eaeb229d1a5efae6eb63d973" exitCode=0 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.522228 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"8158f54e7f263dc473b3aa7c7a0204ebc0dcdf73eaeb229d1a5efae6eb63d973"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.522254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerStarted","Data":"50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.526350 4962 generic.go:334] "Generic (PLEG): container finished" podID="a313fb19-8615-43b7-a19a-df83e50410ba" containerID="67d79612188b2d4ac629f49494e406f3c95c85018ab3966df8e32115cfac1740" exitCode=0 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.526394 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"67d79612188b2d4ac629f49494e406f3c95c85018ab3966df8e32115cfac1740"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.538306 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.885148 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 09:57:43 crc kubenswrapper[4962]: W0220 09:57:43.905975 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4d9655a2_2b08_4827_8126_160f62910b6f.slice/crio-4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa WatchSource:0}: Error finding container 4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa: Status 404 returned error can't find the container with id 4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa Feb 20 09:57:44 crc kubenswrapper[4962]: I0220 09:57:44.543487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d9655a2-2b08-4827-8126-160f62910b6f","Type":"ContainerStarted","Data":"4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa"} Feb 20 09:57:45 crc kubenswrapper[4962]: I0220 09:57:45.558010 4962 generic.go:334] "Generic (PLEG): container finished" podID="4d9655a2-2b08-4827-8126-160f62910b6f" containerID="919e1eeced82919b575c925e5979fd4022daa4fe766b1bf035955e1bec3ef962" exitCode=0 Feb 20 09:57:45 crc kubenswrapper[4962]: I0220 09:57:45.558112 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d9655a2-2b08-4827-8126-160f62910b6f","Type":"ContainerDied","Data":"919e1eeced82919b575c925e5979fd4022daa4fe766b1bf035955e1bec3ef962"} Feb 20 09:57:45 crc kubenswrapper[4962]: I0220 09:57:45.768127 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:49 crc kubenswrapper[4962]: I0220 09:57:49.609104 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:50 crc kubenswrapper[4962]: I0220 09:57:50.618750 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:50 crc kubenswrapper[4962]: I0220 09:57:50.627375 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:51 crc kubenswrapper[4962]: I0220 09:57:51.759469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:51 crc kubenswrapper[4962]: I0220 09:57:51.767392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:52 crc kubenswrapper[4962]: I0220 09:57:52.058939 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:58 crc kubenswrapper[4962]: I0220 09:57:58.740565 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:58 crc kubenswrapper[4962]: I0220 09:57:58.992673 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"4d9655a2-2b08-4827-8126-160f62910b6f\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"4d9655a2-2b08-4827-8126-160f62910b6f\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076663 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d9655a2-2b08-4827-8126-160f62910b6f" (UID: "4d9655a2-2b08-4827-8126-160f62910b6f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076920 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.081839 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d9655a2-2b08-4827-8126-160f62910b6f" (UID: "4d9655a2-2b08-4827-8126-160f62910b6f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.178637 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.715917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d9655a2-2b08-4827-8126-160f62910b6f","Type":"ContainerDied","Data":"4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa"} Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.715971 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.716001 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:58:10 crc kubenswrapper[4962]: I0220 09:58:10.707975 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:58:11 crc kubenswrapper[4962]: I0220 09:58:11.508644 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:58:11 crc kubenswrapper[4962]: I0220 09:58:11.508719 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.025039 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.025754 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58dmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod community-operators-d4pzn_openshift-marketplace(1c487f78-6735-4114-a45a-6c60ccef5983): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.027216 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d4pzn" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.028311 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.028529 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vspvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6q5bk_openshift-marketplace(e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.029748 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6q5bk" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" Feb 20 09:58:12 crc 
kubenswrapper[4962]: E0220 09:58:12.178672 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.180008 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dszmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-5qdmt_openshift-marketplace(805f4075-7fda-4a54-882f-c4fd160148a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.181131 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5qdmt" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.323224 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bwk2"] Feb 20 09:58:12 crc kubenswrapper[4962]: W0220 09:58:12.332901 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd590527b_ed56_4fb4_a712_b09781618a76.slice/crio-23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9 WatchSource:0}: Error finding container 23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9: Status 404 returned error can't find the container with id 23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.807560 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee660135-f5e2-420e-a242-440471e57da2" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" exitCode=0 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.807651 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.810690 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerStarted","Data":"077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.814910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" event={"ID":"d590527b-ed56-4fb4-a712-b09781618a76","Type":"ContainerStarted","Data":"c2573cf807293be7622d57a2e5ba8fcbb04759292a41e15cc267e2a1185db35c"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.815058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" event={"ID":"d590527b-ed56-4fb4-a712-b09781618a76","Type":"ContainerStarted","Data":"23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.818001 4962 generic.go:334] "Generic (PLEG): container finished" podID="a313fb19-8615-43b7-a19a-df83e50410ba" containerID="0524942f9aa37d9ad7a2475893e19f232848fd053a188018fd7e95cce3d53a2e" exitCode=0 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.818103 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"0524942f9aa37d9ad7a2475893e19f232848fd053a188018fd7e95cce3d53a2e"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.821552 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f414667-865d-4c89-b470-50f61a11b60e" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" exitCode=0 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.821615 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" 
event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.824907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerStarted","Data":"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb"} Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.829522 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d4pzn" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.830077 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6q5bk" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.836378 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5qdmt" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.833080 4962 generic.go:334] "Generic (PLEG): container finished" podID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" exitCode=0 Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.833129 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb"} Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.835677 4962 generic.go:334] "Generic (PLEG): container finished" podID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerID="077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c" exitCode=0 Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.835736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c"} Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.839305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" event={"ID":"d590527b-ed56-4fb4-a712-b09781618a76","Type":"ContainerStarted","Data":"f9c20c25cd1b447d8c70fd86425c66912c97841c988de55739723e06f486c18f"} Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.888826 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5bwk2" podStartSLOduration=164.888804492 podStartE2EDuration="2m44.888804492s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:13.885819627 +0000 UTC m=+185.468291473" watchObservedRunningTime="2026-02-20 09:58:13.888804492 +0000 UTC m=+185.471276338" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.859650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" 
event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerStarted","Data":"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.863876 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerStarted","Data":"e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.866407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerStarted","Data":"86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.872640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerStarted","Data":"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.874753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerStarted","Data":"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.899129 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxxjg" podStartSLOduration=3.062735422 podStartE2EDuration="37.8991146s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.455767942 +0000 UTC m=+152.038239778" lastFinishedPulling="2026-02-20 09:58:15.29214706 +0000 UTC m=+186.874618956" observedRunningTime="2026-02-20 09:58:15.884331731 +0000 UTC m=+187.466803577" 
watchObservedRunningTime="2026-02-20 09:58:15.8991146 +0000 UTC m=+187.481586446" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.926927 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grl4h" podStartSLOduration=3.019085772 podStartE2EDuration="34.9268875s" podCreationTimestamp="2026-02-20 09:57:41 +0000 UTC" firstStartedPulling="2026-02-20 09:57:43.524816422 +0000 UTC m=+155.107288268" lastFinishedPulling="2026-02-20 09:58:15.43261815 +0000 UTC m=+187.015089996" observedRunningTime="2026-02-20 09:58:15.92218267 +0000 UTC m=+187.504654526" watchObservedRunningTime="2026-02-20 09:58:15.9268875 +0000 UTC m=+187.509359346" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.927436 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwkjb" podStartSLOduration=3.039981371 podStartE2EDuration="34.927429916s" podCreationTimestamp="2026-02-20 09:57:41 +0000 UTC" firstStartedPulling="2026-02-20 09:57:43.518437539 +0000 UTC m=+155.100909385" lastFinishedPulling="2026-02-20 09:58:15.405886084 +0000 UTC m=+186.988357930" observedRunningTime="2026-02-20 09:58:15.903693135 +0000 UTC m=+187.486164981" watchObservedRunningTime="2026-02-20 09:58:15.927429916 +0000 UTC m=+187.509901762" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.940144 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2gxn5" podStartSLOduration=4.81796757 podStartE2EDuration="35.940129029s" podCreationTimestamp="2026-02-20 09:57:40 +0000 UTC" firstStartedPulling="2026-02-20 09:57:43.529038195 +0000 UTC m=+155.111510041" lastFinishedPulling="2026-02-20 09:58:14.651199654 +0000 UTC m=+186.233671500" observedRunningTime="2026-02-20 09:58:15.940000005 +0000 UTC m=+187.522471861" watchObservedRunningTime="2026-02-20 09:58:15.940129029 +0000 UTC m=+187.522600875" Feb 20 09:58:15 crc 
kubenswrapper[4962]: I0220 09:58:15.962086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4hxs" podStartSLOduration=2.06995262 podStartE2EDuration="35.962071064s" podCreationTimestamp="2026-02-20 09:57:40 +0000 UTC" firstStartedPulling="2026-02-20 09:57:41.487406294 +0000 UTC m=+153.069878140" lastFinishedPulling="2026-02-20 09:58:15.379524738 +0000 UTC m=+186.961996584" observedRunningTime="2026-02-20 09:58:15.960398681 +0000 UTC m=+187.542870527" watchObservedRunningTime="2026-02-20 09:58:15.962071064 +0000 UTC m=+187.544542900" Feb 20 09:58:16 crc kubenswrapper[4962]: I0220 09:58:16.405149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.580454 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.580567 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.717302 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.829685 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.358815 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 09:58:20 crc kubenswrapper[4962]: E0220 09:58:20.359118 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9655a2-2b08-4827-8126-160f62910b6f" containerName="pruner" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.359133 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9655a2-2b08-4827-8126-160f62910b6f" containerName="pruner" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.359341 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9655a2-2b08-4827-8126-160f62910b6f" containerName="pruner" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.359888 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.364885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.365288 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.365414 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.510526 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.510577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.579995 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.580634 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.611270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.611320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.611385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.646818 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.721682 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc 
kubenswrapper[4962]: I0220 09:58:20.944106 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4hxs"
Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.978265 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.246031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.276740 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2gxn5"
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.277018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2gxn5"
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.328639 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2gxn5"
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.874185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwkjb"
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.874561 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwkjb"
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.907321 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerStarted","Data":"b56472da857691a5b95ad0cc186ba642ef9479130c88e2dcbaa014b7f15e8952"}
Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.955124 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2gxn5"
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.197221 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grl4h"
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.197274 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grl4h"
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.234926 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grl4h"
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.910659 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zwkjb" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" probeResult="failure" output=<
Feb 20 09:58:22 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s
Feb 20 09:58:22 crc kubenswrapper[4962]: >
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.914433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerStarted","Data":"c37f32ab40c4dd0e0fc921352e227a03bbcfb1ba144d4b58dbdfb5d40638c3a6"}
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.930279 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.930254821 podStartE2EDuration="2.930254821s" podCreationTimestamp="2026-02-20 09:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:22.927924357 +0000 UTC m=+194.510396203" watchObservedRunningTime="2026-02-20 09:58:22.930254821 +0000 UTC m=+194.512726667"
Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.956185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grl4h"
Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.654920 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"]
Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.920643 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerID="c37f32ab40c4dd0e0fc921352e227a03bbcfb1ba144d4b58dbdfb5d40638c3a6" exitCode=0
Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.921838 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerDied","Data":"c37f32ab40c4dd0e0fc921352e227a03bbcfb1ba144d4b58dbdfb5d40638c3a6"}
Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.922022 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2gxn5" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server" containerID="cri-o://86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824" gracePeriod=2
Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.654699 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"]
Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.927421 4962 generic.go:334] "Generic (PLEG): container finished" podID="a313fb19-8615-43b7-a19a-df83e50410ba" containerID="86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824" exitCode=0
Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.927506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824"}
Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.927674 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grl4h" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" containerID="cri-o://e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42" gracePeriod=2
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.231292 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.257976 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384774 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") "
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384861 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"a313fb19-8615-43b7-a19a-df83e50410ba\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") "
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384886 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") "
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" (UID: "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384948 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"a313fb19-8615-43b7-a19a-df83e50410ba\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") "
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.385012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"a313fb19-8615-43b7-a19a-df83e50410ba\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") "
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.385275 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.385914 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities" (OuterVolumeSpecName: "utilities") pod "a313fb19-8615-43b7-a19a-df83e50410ba" (UID: "a313fb19-8615-43b7-a19a-df83e50410ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.391442 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" (UID: "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.391556 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq" (OuterVolumeSpecName: "kube-api-access-vvgwq") pod "a313fb19-8615-43b7-a19a-df83e50410ba" (UID: "a313fb19-8615-43b7-a19a-df83e50410ba"). InnerVolumeSpecName "kube-api-access-vvgwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.409787 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a313fb19-8615-43b7-a19a-df83e50410ba" (UID: "a313fb19-8615-43b7-a19a-df83e50410ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486221 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486260 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486273 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486283 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.934368 4962 generic.go:334] "Generic (PLEG): container finished" podID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerID="e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42" exitCode=0
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.934449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42"}
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.937021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"1d342c2b45fca81302a1d91ac4539bc994004d9b4834928e5a7a0d50c28cc22c"}
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.937072 4962 scope.go:117] "RemoveContainer" containerID="86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.937239 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.939247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerDied","Data":"b56472da857691a5b95ad0cc186ba642ef9479130c88e2dcbaa014b7f15e8952"}
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.939275 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56472da857691a5b95ad0cc186ba642ef9479130c88e2dcbaa014b7f15e8952"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.939838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957520 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957884 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-utilities"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957896 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-utilities"
Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957923 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerName="pruner"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957930 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerName="pruner"
Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957938 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-content"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957944 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-content"
Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957953 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.958050 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerName="pruner"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.958058 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.960101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.963316 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.964071 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.970027 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.996208 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"]
Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.999214 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"]
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.092174 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.092261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.092283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193353 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.209210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.292825 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.145468 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" path="/var/lib/kubelet/pods/a313fb19-8615-43b7-a19a-df83e50410ba/volumes"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.543628 4962 scope.go:117] "RemoveContainer" containerID="0524942f9aa37d9ad7a2475893e19f232848fd053a188018fd7e95cce3d53a2e"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.631383 4962 scope.go:117] "RemoveContainer" containerID="67d79612188b2d4ac629f49494e406f3c95c85018ab3966df8e32115cfac1740"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.658315 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.762075 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.813101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") "
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.813206 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") "
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.813249 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") "
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.815276 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities" (OuterVolumeSpecName: "utilities") pod "4065ac08-9c62-48db-bbfe-9e53ab7d5463" (UID: "4065ac08-9c62-48db-bbfe-9e53ab7d5463"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.818715 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl" (OuterVolumeSpecName: "kube-api-access-w94dl") pod "4065ac08-9c62-48db-bbfe-9e53ab7d5463" (UID: "4065ac08-9c62-48db-bbfe-9e53ab7d5463"). InnerVolumeSpecName "kube-api-access-w94dl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.914332 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.914357 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.935788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4065ac08-9c62-48db-bbfe-9e53ab7d5463" (UID: "4065ac08-9c62-48db-bbfe-9e53ab7d5463"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.952613 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.952580 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601"}
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.952662 4962 scope.go:117] "RemoveContainer" containerID="e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42"
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.964705 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerStarted","Data":"9b9dbdcac9ea0d9a44c9e69e43cace295320630fa87bf53a68f617de558a65af"}
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.966676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerStarted","Data":"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d"}
Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.975710 4962 scope.go:117] "RemoveContainer" containerID="077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c"
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.000435 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"]
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.003239 4962 scope.go:117] "RemoveContainer" containerID="8158f54e7f263dc473b3aa7c7a0204ebc0dcdf73eaeb229d1a5efae6eb63d973"
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.006505 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"]
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.015420 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.621984 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxxjg"
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.971957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerStarted","Data":"d915b56a671459092c2c1eb6c3a687d96ecc073838917251978e78628f894691"}
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.974314 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f4075-7fda-4a54-882f-c4fd160148a4" containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" exitCode=0
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.974520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958"}
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.977444 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" exitCode=0
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.977549 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d"}
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.982746 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c487f78-6735-4114-a45a-6c60ccef5983" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" exitCode=0
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.982807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79"}
Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.988052 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.988040216 podStartE2EDuration="3.988040216s" podCreationTimestamp="2026-02-20 09:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:28.985926046 +0000 UTC m=+200.568397912" watchObservedRunningTime="2026-02-20 09:58:28.988040216 +0000 UTC m=+200.570512062"
Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.146912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" path="/var/lib/kubelet/pods/4065ac08-9c62-48db-bbfe-9e53ab7d5463/volumes"
Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.992872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerStarted","Data":"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386"}
Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.995709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerStarted","Data":"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0"}
Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.999233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerStarted","Data":"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b"}
Feb 20 09:58:30 crc kubenswrapper[4962]: I0220 09:58:30.011868 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4pzn" podStartSLOduration=2.9540328110000003 podStartE2EDuration="52.011846542s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.447975995 +0000 UTC m=+152.030447861" lastFinishedPulling="2026-02-20 09:58:29.505789746 +0000 UTC m=+201.088261592" observedRunningTime="2026-02-20 09:58:30.008187263 +0000 UTC m=+201.590659099" watchObservedRunningTime="2026-02-20 09:58:30.011846542 +0000 UTC m=+201.594318388"
Feb 20 09:58:30 crc kubenswrapper[4962]: I0220 09:58:30.024683 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6q5bk" podStartSLOduration=2.885081613 podStartE2EDuration="52.024660634s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.448759699 +0000 UTC m=+152.031231545" lastFinishedPulling="2026-02-20 09:58:29.58833872 +0000 UTC m=+201.170810566" observedRunningTime="2026-02-20 09:58:30.021759989 +0000 UTC m=+201.604231855" watchObservedRunningTime="2026-02-20 09:58:30.024660634 +0000 UTC m=+201.607132480"
Feb 20 09:58:30 crc kubenswrapper[4962]: I0220 09:58:30.041146 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5qdmt" podStartSLOduration=3.05185998 podStartE2EDuration="52.041127895s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.445141535 +0000 UTC m=+152.027613381" lastFinishedPulling="2026-02-20 09:58:29.43440945 +0000 UTC m=+201.016881296" observedRunningTime="2026-02-20 09:58:30.039575935 +0000 UTC m=+201.622047781" watchObservedRunningTime="2026-02-20 09:58:30.041127895 +0000 UTC m=+201.623599741"
Feb 20 09:58:31 crc kubenswrapper[4962]: I0220 09:58:31.916534 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwkjb"
Feb 20 09:58:31 crc kubenswrapper[4962]: I0220 09:58:31.961004 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwkjb"
Feb 20 09:58:38 crc kubenswrapper[4962]: I0220 09:58:38.781149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:58:38 crc kubenswrapper[4962]: I0220 09:58:38.781820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:58:38 crc kubenswrapper[4962]: I0220 09:58:38.828304 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.024863 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.024954 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.082582 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.116255 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6q5bk"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.152372 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.314098 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.314161 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.400603 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:58:40 crc kubenswrapper[4962]: I0220 09:58:40.113898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4pzn"
Feb 20 09:58:40 crc kubenswrapper[4962]: I0220 09:58:40.713968 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"]
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.312215 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"]
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.312403 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5qdmt" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" containerID="cri-o://213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" gracePeriod=2
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.507935 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.508005 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.508060 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46"
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.509799 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.510070 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432" gracePeriod=600
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.713335 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt"
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.820004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"805f4075-7fda-4a54-882f-c4fd160148a4\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") "
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.820344 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"805f4075-7fda-4a54-882f-c4fd160148a4\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") "
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.820437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"805f4075-7fda-4a54-882f-c4fd160148a4\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") "
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.821715 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities" (OuterVolumeSpecName: "utilities") pod "805f4075-7fda-4a54-882f-c4fd160148a4" (UID: "805f4075-7fda-4a54-882f-c4fd160148a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.827531 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr" (OuterVolumeSpecName: "kube-api-access-dszmr") pod "805f4075-7fda-4a54-882f-c4fd160148a4" (UID: "805f4075-7fda-4a54-882f-c4fd160148a4"). InnerVolumeSpecName "kube-api-access-dszmr".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.875901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "805f4075-7fda-4a54-882f-c4fd160148a4" (UID: "805f4075-7fda-4a54-882f-c4fd160148a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.921959 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.921992 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.922005 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.082895 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f4075-7fda-4a54-882f-c4fd160148a4" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" exitCode=0 Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.082944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.083366 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"75a64fa0799e34c46374b381b4c7c1a53295cecd1a95e7229c96eb57af23d670"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.083390 4962 scope.go:117] "RemoveContainer" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.082991 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.085615 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432" exitCode=0 Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.085849 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d4pzn" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" containerID="cri-o://86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" gracePeriod=2 Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.086115 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.086150 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.100556 4962 scope.go:117] "RemoveContainer" 
containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.124112 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.128347 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.144523 4962 scope.go:117] "RemoveContainer" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.201332 4962 scope.go:117] "RemoveContainer" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" Feb 20 09:58:42 crc kubenswrapper[4962]: E0220 09:58:42.201821 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0\": container with ID starting with 213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0 not found: ID does not exist" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.201849 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0"} err="failed to get container status \"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0\": rpc error: code = NotFound desc = could not find container \"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0\": container with ID starting with 213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0 not found: ID does not exist" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.201870 4962 scope.go:117] "RemoveContainer" 
containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" Feb 20 09:58:42 crc kubenswrapper[4962]: E0220 09:58:42.202216 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958\": container with ID starting with a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958 not found: ID does not exist" containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.202263 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958"} err="failed to get container status \"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958\": rpc error: code = NotFound desc = could not find container \"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958\": container with ID starting with a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958 not found: ID does not exist" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.202289 4962 scope.go:117] "RemoveContainer" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" Feb 20 09:58:42 crc kubenswrapper[4962]: E0220 09:58:42.203025 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4\": container with ID starting with d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4 not found: ID does not exist" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.203051 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4"} err="failed to get container status \"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4\": rpc error: code = NotFound desc = could not find container \"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4\": container with ID starting with d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4 not found: ID does not exist" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.357922 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.528820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"1c487f78-6735-4114-a45a-6c60ccef5983\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.528907 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"1c487f78-6735-4114-a45a-6c60ccef5983\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.528940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"1c487f78-6735-4114-a45a-6c60ccef5983\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.529692 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities" (OuterVolumeSpecName: "utilities") pod 
"1c487f78-6735-4114-a45a-6c60ccef5983" (UID: "1c487f78-6735-4114-a45a-6c60ccef5983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.532208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf" (OuterVolumeSpecName: "kube-api-access-58dmf") pod "1c487f78-6735-4114-a45a-6c60ccef5983" (UID: "1c487f78-6735-4114-a45a-6c60ccef5983"). InnerVolumeSpecName "kube-api-access-58dmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.577126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c487f78-6735-4114-a45a-6c60ccef5983" (UID: "1c487f78-6735-4114-a45a-6c60ccef5983"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.629690 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.629728 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.629739 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093578 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c487f78-6735-4114-a45a-6c60ccef5983" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" exitCode=0 Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093631 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386"} Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"7fcd5743db242f51c1ff9cb31c18720e064f45718743b615c36cf8bb2d39f79d"} Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093703 4962 scope.go:117] "RemoveContainer" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 
09:58:43.093699 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.109089 4962 scope.go:117] "RemoveContainer" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.119305 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.125910 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.146474 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" path="/var/lib/kubelet/pods/1c487f78-6735-4114-a45a-6c60ccef5983/volumes" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.147141 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" path="/var/lib/kubelet/pods/805f4075-7fda-4a54-882f-c4fd160148a4/volumes" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.149724 4962 scope.go:117] "RemoveContainer" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.166894 4962 scope.go:117] "RemoveContainer" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" Feb 20 09:58:43 crc kubenswrapper[4962]: E0220 09:58:43.170148 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386\": container with ID starting with 86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386 not found: ID does not exist" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" 
Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170223 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386"} err="failed to get container status \"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386\": rpc error: code = NotFound desc = could not find container \"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386\": container with ID starting with 86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386 not found: ID does not exist" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170260 4962 scope.go:117] "RemoveContainer" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" Feb 20 09:58:43 crc kubenswrapper[4962]: E0220 09:58:43.170698 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79\": container with ID starting with 24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79 not found: ID does not exist" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170742 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79"} err="failed to get container status \"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79\": rpc error: code = NotFound desc = could not find container \"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79\": container with ID starting with 24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79 not found: ID does not exist" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170772 4962 scope.go:117] "RemoveContainer" 
containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" Feb 20 09:58:43 crc kubenswrapper[4962]: E0220 09:58:43.171666 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25\": container with ID starting with 43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25 not found: ID does not exist" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.171699 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25"} err="failed to get container status \"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25\": rpc error: code = NotFound desc = could not find container \"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25\": container with ID starting with 43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25 not found: ID does not exist" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.851267 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" containerID="cri-o://7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf" gracePeriod=15 Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.100606 4962 generic.go:334] "Generic (PLEG): container finished" podID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerID="7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf" exitCode=0 Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.100676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" 
event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerDied","Data":"7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf"} Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.184524 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350453 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350567 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350613 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350664 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350704 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350729 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350759 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350799 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350818 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.351551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod 
"f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.351835 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.351999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.352759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.353448 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.356546 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357512 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr" (OuterVolumeSpecName: "kube-api-access-2c9lr") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "kube-api-access-2c9lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357532 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357780 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.358102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.359819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.360846 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.364134 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452237 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452530 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452607 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452667 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452726 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9lr\" 
(UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452784 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452849 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452906 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452969 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453024 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453087 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453143 4962 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453205 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453270 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.110909 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerDied","Data":"5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877"} Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.110996 4962 scope.go:117] "RemoveContainer" containerID="7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf" Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.111145 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.160492 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.167133 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:58:47 crc kubenswrapper[4962]: I0220 09:58:47.147368 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" path="/var/lib/kubelet/pods/f8161f87-3814-4d02-84ff-b94b8b05c59e/volumes" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.370221 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7857967b8b-hdxkw"] Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.370964 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.370979 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.370994 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371002 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371015 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371024 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371039 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371050 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371061 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371069 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371083 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371091 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371105 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371114 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371133 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371161 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371169 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371181 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371190 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371300 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371316 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371330 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371340 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371787 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.374252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.374360 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.374795 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.376261 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.377470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.377999 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.378135 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.378344 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.379067 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.379121 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 09:58:52 crc 
kubenswrapper[4962]: I0220 09:58:52.379220 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.380164 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.399420 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.401310 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7857967b8b-hdxkw"] Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.404830 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.406254 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-error\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553703 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-audit-policies\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " 
pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-service-ca\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-router-certs\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: 
I0220 09:58:52.553926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-login\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554024 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554052 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554090 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/1164f259-4f1f-498e-81a1-817747913204-audit-dir\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2s5\" (UniqueName: \"kubernetes.io/projected/1164f259-4f1f-498e-81a1-817747913204-kube-api-access-wk2s5\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554175 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-session\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654551 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-login\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " 
pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1164f259-4f1f-498e-81a1-817747913204-audit-dir\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654713 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2s5\" (UniqueName: 
\"kubernetes.io/projected/1164f259-4f1f-498e-81a1-817747913204-kube-api-access-wk2s5\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-session\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-error\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-audit-policies\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc 
kubenswrapper[4962]: I0220 09:58:52.654817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-service-ca\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654835 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654843 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1164f259-4f1f-498e-81a1-817747913204-audit-dir\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.655354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-router-certs\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-audit-policies\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656421 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-service-ca\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656444 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc 
kubenswrapper[4962]: I0220 09:58:52.660324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.660818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.661023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.661267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-router-certs\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.662280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-session\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.671055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-login\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.671280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2s5\" (UniqueName: \"kubernetes.io/projected/1164f259-4f1f-498e-81a1-817747913204-kube-api-access-wk2s5\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.671820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.676118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-error\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 
09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.705958 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.891257 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7857967b8b-hdxkw"] Feb 20 09:58:52 crc kubenswrapper[4962]: W0220 09:58:52.900785 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1164f259_4f1f_498e_81a1_817747913204.slice/crio-390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75 WatchSource:0}: Error finding container 390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75: Status 404 returned error can't find the container with id 390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75 Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.160910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" event={"ID":"1164f259-4f1f-498e-81a1-817747913204","Type":"ContainerStarted","Data":"e29fc665c030d5623ab5549a121dd1b3a715ea255ab201968b029743202928c1"} Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.161833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.161973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" event={"ID":"1164f259-4f1f-498e-81a1-817747913204","Type":"ContainerStarted","Data":"390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75"} Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.163747 4962 patch_prober.go:28] interesting pod/oauth-openshift-7857967b8b-hdxkw container/oauth-openshift namespace/openshift-authentication: Readiness probe 
status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.163920 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" podUID="1164f259-4f1f-498e-81a1-817747913204" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.185236 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" podStartSLOduration=35.185214901 podStartE2EDuration="35.185214901s" podCreationTimestamp="2026-02-20 09:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:53.181348434 +0000 UTC m=+224.763820280" watchObservedRunningTime="2026-02-20 09:58:53.185214901 +0000 UTC m=+224.767686757" Feb 20 09:58:54 crc kubenswrapper[4962]: I0220 09:58:54.173384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.568011 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.569296 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.570728 4962 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571122 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571302 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571317 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571294 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571385 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571422 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571433 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571438 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571445 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571452 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571464 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571471 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571482 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571487 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571501 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571354 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571536 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571512 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571772 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571926 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571942 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571955 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571976 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571989 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.572001 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.615158 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619386 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619552 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619626 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619681 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc 
kubenswrapper[4962]: I0220 09:59:05.721525 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721683 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721716 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721834 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721925 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722241 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722330 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722371 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.906033 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.936175 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ec07a01ca766 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,LastTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.236903 4962 generic.go:334] "Generic (PLEG): container finished" podID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerID="d915b56a671459092c2c1eb6c3a687d96ecc073838917251978e78628f894691" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.237014 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerDied","Data":"d915b56a671459092c2c1eb6c3a687d96ecc073838917251978e78628f894691"} Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.238253 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.238794 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239183 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7"} Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7089e214c78fca8c3b4cb7548e34b64d3996ba2d47307a89e8dc936ed301704b"} Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239853 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.240463 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.241307 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.242981 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244173 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244682 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244702 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244708 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244715 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" exitCode=2 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244735 4962 scope.go:117] "RemoveContainer" containerID="e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.274399 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.562636 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.563757 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.564449 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.665630 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.666045 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.676534 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.679308 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.679632 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.679660 4962 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.680041 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="200ms" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680346 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680524 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680728 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock" (OuterVolumeSpecName: "var-lock") pod "55ba5b4b-9a58-40e7-a3a3-00764477f5a9" (UID: "55ba5b4b-9a58-40e7-a3a3-00764477f5a9"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.681230 4962 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.681631 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "55ba5b4b-9a58-40e7-a3a3-00764477f5a9" (UID: "55ba5b4b-9a58-40e7-a3a3-00764477f5a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.696245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "55ba5b4b-9a58-40e7-a3a3-00764477f5a9" (UID: "55ba5b4b-9a58-40e7-a3a3-00764477f5a9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.782676 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.782718 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.882564 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="400ms" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.980401 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.981505 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.982229 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.982559 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.983065 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984396 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984460 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984490 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.085453 4962 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.085713 4962 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.085723 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.283460 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="800ms" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.285367 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.286785 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" exitCode=0 Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.286931 4962 scope.go:117] "RemoveContainer" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.287008 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.289381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerDied","Data":"9b9dbdcac9ea0d9a44c9e69e43cace295320630fa87bf53a68f617de558a65af"} Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.289417 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b9dbdcac9ea0d9a44c9e69e43cace295320630fa87bf53a68f617de558a65af" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.289484 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.310132 4962 scope.go:117] "RemoveContainer" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.316207 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.316689 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.317445 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.318177 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.318895 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.319312 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.330161 4962 scope.go:117] "RemoveContainer" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.352730 4962 scope.go:117] "RemoveContainer" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.370250 4962 scope.go:117] "RemoveContainer" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.392472 4962 scope.go:117] "RemoveContainer" 
containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.424932 4962 scope.go:117] "RemoveContainer" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.425790 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\": container with ID starting with f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0 not found: ID does not exist" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.425827 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0"} err="failed to get container status \"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\": rpc error: code = NotFound desc = could not find container \"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\": container with ID starting with f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0 not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.425850 4962 scope.go:117] "RemoveContainer" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.426141 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\": container with ID starting with 1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c not found: ID does not exist" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" Feb 20 09:59:08 crc 
kubenswrapper[4962]: I0220 09:59:08.426192 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c"} err="failed to get container status \"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\": rpc error: code = NotFound desc = could not find container \"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\": container with ID starting with 1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.426211 4962 scope.go:117] "RemoveContainer" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.426557 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\": container with ID starting with c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01 not found: ID does not exist" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.426698 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01"} err="failed to get container status \"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\": rpc error: code = NotFound desc = could not find container \"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\": container with ID starting with c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01 not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.426829 4962 scope.go:117] "RemoveContainer" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" Feb 20 
09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.427511 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\": container with ID starting with fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5 not found: ID does not exist" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.427645 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5"} err="failed to get container status \"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\": rpc error: code = NotFound desc = could not find container \"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\": container with ID starting with fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5 not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.427740 4962 scope.go:117] "RemoveContainer" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.428685 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\": container with ID starting with 8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc not found: ID does not exist" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.428738 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc"} err="failed to get container status 
\"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\": rpc error: code = NotFound desc = could not find container \"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\": container with ID starting with 8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.428779 4962 scope.go:117] "RemoveContainer" containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.429378 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\": container with ID starting with 6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04 not found: ID does not exist" containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.429464 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04"} err="failed to get container status \"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\": rpc error: code = NotFound desc = could not find container \"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\": container with ID starting with 6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04 not found: ID does not exist" Feb 20 09:59:09 crc kubenswrapper[4962]: E0220 09:59:09.085549 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="1.6s" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.142562 4962 status_manager.go:851] "Failed to get 
status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.142939 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.143519 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.150464 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 20 09:59:10 crc kubenswrapper[4962]: E0220 09:59:10.686741 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="3.2s" Feb 20 09:59:13 crc kubenswrapper[4962]: E0220 09:59:13.888133 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="6.4s" Feb 20 09:59:13 
crc kubenswrapper[4962]: E0220 09:59:13.944882 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ec07a01ca766 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,LastTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.138717 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.140013 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.140715 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.157288 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.157329 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:18 crc kubenswrapper[4962]: E0220 09:59:18.158045 4962 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.158846 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.354272 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c238ddfc7fab35ecb5482258da3b658be3500491461c24047354a98eb6a27f64"} Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.148677 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.149650 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.149851 4962 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363017 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363157 4962 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce" exitCode=1 Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce"} Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363840 4962 scope.go:117] "RemoveContainer" containerID="1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.364013 4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.364503 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365372 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365548 4962 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" 
containerID="bb2b9882d8095d2704e3f8cba7c2cdd33ad274119271e735b71a0c38b3733d31" exitCode=0 Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bb2b9882d8095d2704e3f8cba7c2cdd33ad274119271e735b71a0c38b3733d31"} Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365780 4962 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365902 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365931 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:19 crc kubenswrapper[4962]: E0220 09:59:19.366256 4962 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.366576 4962 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.366879 
4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.367163 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.367384 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.380889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f9aec2bc3a9f2b3c8a19b0af95d99106f14a4ab629a1908b28f64cc2ef6d06d0"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.381250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f204e6a4667241d9c4421e17e7d6907b9abc4384380a51d11622983c8002b1b"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.381267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"260a8f62c826968d9b2df75e761a19b013bee69404c14d6dbde556f2879025e0"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.381280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2cf6a1fa04cc10bc933a2532bafa890f666b3d24cd238d729a7466aad6819739"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.389401 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.389563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5"} Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.399540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ebb3720158348b5d0d1f1ae9565aa45deed26ebb9a228e63f9bfe291fde16b7d"} Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.400149 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.400167 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.400448 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:22 crc 
kubenswrapper[4962]: I0220 09:59:22.458070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:59:22 crc kubenswrapper[4962]: I0220 09:59:22.458291 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 09:59:22 crc kubenswrapper[4962]: I0220 09:59:22.458328 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 09:59:23 crc kubenswrapper[4962]: I0220 09:59:23.159197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:23 crc kubenswrapper[4962]: I0220 09:59:23.159260 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:23 crc kubenswrapper[4962]: I0220 09:59:23.168649 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:26 crc kubenswrapper[4962]: I0220 09:59:26.429431 4962 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:26 crc kubenswrapper[4962]: I0220 09:59:26.570407 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.438440 4962 kubelet.go:1909] "Trying to delete 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.438491 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.445394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.450281 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="032c0e04-2eae-47f8-94e5-4e93feb99a65" Feb 20 09:59:28 crc kubenswrapper[4962]: I0220 09:59:28.443165 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:28 crc kubenswrapper[4962]: I0220 09:59:28.443866 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:29 crc kubenswrapper[4962]: I0220 09:59:29.173809 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="032c0e04-2eae-47f8-94e5-4e93feb99a65" Feb 20 09:59:32 crc kubenswrapper[4962]: I0220 09:59:32.458210 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 09:59:32 crc kubenswrapper[4962]: I0220 09:59:32.458815 4962 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 09:59:35 crc kubenswrapper[4962]: I0220 09:59:35.952512 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 09:59:35 crc kubenswrapper[4962]: I0220 09:59:35.983408 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 09:59:36 crc kubenswrapper[4962]: I0220 09:59:36.708518 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 09:59:36 crc kubenswrapper[4962]: I0220 09:59:36.737246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 09:59:36 crc kubenswrapper[4962]: I0220 09:59:36.763356 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 09:59:37 crc kubenswrapper[4962]: I0220 09:59:37.324210 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 09:59:37 crc kubenswrapper[4962]: I0220 09:59:37.554934 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 09:59:37 crc kubenswrapper[4962]: I0220 09:59:37.587780 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.167401 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 20 09:59:38 crc 
kubenswrapper[4962]: I0220 09:59:38.216424 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.334644 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.464096 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.464550 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.467188 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.609297 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.786329 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.793605 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.951755 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.975573 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.003179 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.034436 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.273495 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.316623 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.474346 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.673316 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.716787 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.026473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.351183 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.385931 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.421587 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" 
Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.611006 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.694734 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.696633 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.703188 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.703150216 podStartE2EDuration="35.703150216s" podCreationTimestamp="2026-02-20 09:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:59:25.991504813 +0000 UTC m=+257.573976689" watchObservedRunningTime="2026-02-20 09:59:40.703150216 +0000 UTC m=+272.285622102" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.706166 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.706237 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.711369 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.726951 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.726926615 podStartE2EDuration="14.726926615s" podCreationTimestamp="2026-02-20 09:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:59:40.723685433 +0000 UTC m=+272.306157299" watchObservedRunningTime="2026-02-20 09:59:40.726926615 +0000 UTC m=+272.309398471" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.790262 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.912479 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.913171 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.927852 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.970327 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.976043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.993887 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.002338 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.008120 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.054752 4962 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.135774 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.218914 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.473147 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.555740 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.749814 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.888885 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.906814 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.043701 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.197857 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.318790 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.333933 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.376550 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.458228 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.458359 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.458452 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.459446 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.459639 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5" gracePeriod=30 Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.471891 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.561239 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.561415 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.577282 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.578316 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.596755 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.600512 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.605405 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.628113 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 
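The startup-probe failures logged at 09:59:22, 09:59:32, and 09:59:42, followed by "Container kube-controller-manager failed startup probe, will be restarted" and the kill with gracePeriod=30, are the kubelet enforcing a `startupProbe` on the static pod. A minimal sketch of a pod-spec fragment that would produce log lines like these — only the probe URL (`https://192.168.126.11:10257/healthz`) and the 30 s grace period appear in the log; the period and threshold values are assumptions inferred from the ~10 s spacing of the three logged failures:

```yaml
# Hypothetical static-pod fragment (not the actual manifest).
# Only host/port/path and the grace period are taken from the log above.
spec:
  terminationGracePeriodSeconds: 30   # matches gracePeriod=30 in the kill message
  containers:
  - name: kube-controller-manager
    startupProbe:
      httpGet:
        scheme: HTTPS
        host: 192.168.126.11
        port: 10257
        path: /healthz
      periodSeconds: 10       # assumed; matches the ~10 s spacing of logged failures
      failureThreshold: 3     # assumed; once exhausted, the kubelet restarts the container
```

Under this reading, the prober (`prober.go`) records each `connect: connection refused` as a probe failure, and once the threshold is exhausted the runtime manager (`kuberuntime_manager.go` / `kuberuntime_container.go`) kills the container with the pod's termination grace period, as the 09:59:42 entries show.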
09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.678893 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.692704 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.758173 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.779817 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.812781 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.824470 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.830881 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.875825 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.900901 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.914213 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.917076 4962 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.921399 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.932808 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.959448 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.054217 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.098641 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.188392 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.291510 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.404080 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.449334 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.450729 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 
09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.475715 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.483701 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.679640 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.715500 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.799360 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.823182 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.836573 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.842942 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.846003 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.869776 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.887185 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 
09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.921477 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.934069 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.943806 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.151757 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.160938 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.360677 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.395019 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.395103 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.424691 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.480228 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.509790 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.604109 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.620499 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.658883 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.683769 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.687992 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.736744 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.757633 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.801048 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.855110 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.914583 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: 
I0220 09:59:44.968197 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.981507 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.074125 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.077497 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.117821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.154537 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.177339 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.184934 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.240707 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.295078 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.497661 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.503937 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.523857 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.531926 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.630560 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.698205 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.720949 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.748940 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.789262 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.800890 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.815312 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.886490 
4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.911326 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.938437 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.992037 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.069234 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.079095 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.194019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.254619 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.257354 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.280285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.304964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 
09:59:46.330507 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.341568 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.343995 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.396499 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.423941 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.776103 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.799083 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.852696 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.852732 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.925518 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.935869 4962 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.140770 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.182838 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.192148 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.214855 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.312098 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.336532 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.405309 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.489883 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.512100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.540540 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.577909 4962 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.578039 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.638449 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.643328 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.658855 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.662045 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.689994 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.719269 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.751201 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.926798 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.956465 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 09:59:47 crc kubenswrapper[4962]: 
I0220 09:59:47.989290 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.012043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.126410 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.149156 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.253691 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.308802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.314766 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.327278 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.330885 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.395697 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.627416 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 
09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.647109 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.667768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.675977 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.690567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.690825 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" gracePeriod=5 Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.844676 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.885717 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.000902 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.018892 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.075835 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.107442 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.113638 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.164488 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.165259 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.232203 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.270038 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.307826 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.412024 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.427829 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.478722 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.480743 4962 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.602791 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.732868 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.768503 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.768773 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.811555 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.822198 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.988530 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.063629 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.082702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.173724 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 09:59:50 
crc kubenswrapper[4962]: I0220 09:59:50.194033 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.357405 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.474743 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.483666 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.593927 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.598398 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.612229 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.733989 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.003548 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.127623 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.251362 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 
09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.264536 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.272523 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.310325 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.327457 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.639638 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.667115 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.909663 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.909677 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.991468 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.259830 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.319957 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.402104 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.640582 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.670468 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 09:59:53 crc kubenswrapper[4962]: I0220 09:59:53.023621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 09:59:53 crc kubenswrapper[4962]: I0220 09:59:53.092391 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 09:59:53 crc kubenswrapper[4962]: I0220 09:59:53.313427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.380584 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.380671 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520016 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520192 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520181 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520216 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520322 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520465 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520544 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521121 4962 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521166 4962 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521185 4962 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521204 4962 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.534016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.613387 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.622066 4962 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627113 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627260 4962 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" exitCode=137 Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627350 4962 scope.go:117] "RemoveContainer" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627809 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.649251 4962 scope.go:117] "RemoveContainer" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" Feb 20 09:59:54 crc kubenswrapper[4962]: E0220 09:59:54.649806 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7\": container with ID starting with 77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7 not found: ID does not exist" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.649838 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7"} err="failed to get container status \"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7\": rpc error: code = NotFound desc = could not find container \"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7\": container with ID starting with 77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7 not found: ID does not exist" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.786721 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.146839 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.147130 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 
09:59:55.157966 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.158002 4962 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="35a53ee2-5747-4fe8-89c7-97453524e674" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.161911 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.161954 4962 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="35a53ee2-5747-4fe8-89c7-97453524e674" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.418788 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.875554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.877888 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxxjg" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" containerID="cri-o://c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.881467 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.881955 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6q5bk" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" 
containerName="registry-server" containerID="cri-o://ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.896353 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.896626 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" containerID="cri-o://3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.901737 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.902007 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4hxs" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" containerID="cri-o://4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.905553 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.907730 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwkjb" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" containerID="cri-o://24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" gracePeriod=30 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.327817 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.334021 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.338719 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.341914 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.344460 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474403 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474510 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"ee660135-f5e2-420e-a242-440471e57da2\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 
10:00:06.474558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474623 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"2f414667-865d-4c89-b470-50f61a11b60e\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474698 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqtxr\" (UniqueName: 
\"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"ee660135-f5e2-420e-a242-440471e57da2\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474768 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474801 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"2f414667-865d-4c89-b470-50f61a11b60e\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"ee660135-f5e2-420e-a242-440471e57da2\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474846 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474866 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " Feb 20 10:00:06 crc 
kubenswrapper[4962]: I0220 10:00:06.474887 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"2f414667-865d-4c89-b470-50f61a11b60e\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.478069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities" (OuterVolumeSpecName: "utilities") pod "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" (UID: "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.478267 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bfd57a5c-0892-46a0-8005-0a8f70c146fd" (UID: "bfd57a5c-0892-46a0-8005-0a8f70c146fd"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.478667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities" (OuterVolumeSpecName: "utilities") pod "ee660135-f5e2-420e-a242-440471e57da2" (UID: "ee660135-f5e2-420e-a242-440471e57da2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.479783 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities" (OuterVolumeSpecName: "utilities") pod "77564a1c-aefc-4caf-86d9-55c2ef795bb7" (UID: "77564a1c-aefc-4caf-86d9-55c2ef795bb7"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.481017 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities" (OuterVolumeSpecName: "utilities") pod "2f414667-865d-4c89-b470-50f61a11b60e" (UID: "2f414667-865d-4c89-b470-50f61a11b60e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.483999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6" (OuterVolumeSpecName: "kube-api-access-4dxf6") pod "2f414667-865d-4c89-b470-50f61a11b60e" (UID: "2f414667-865d-4c89-b470-50f61a11b60e"). InnerVolumeSpecName "kube-api-access-4dxf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.484477 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr" (OuterVolumeSpecName: "kube-api-access-hqtxr") pod "ee660135-f5e2-420e-a242-440471e57da2" (UID: "ee660135-f5e2-420e-a242-440471e57da2"). InnerVolumeSpecName "kube-api-access-hqtxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.486006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz" (OuterVolumeSpecName: "kube-api-access-rt6vz") pod "bfd57a5c-0892-46a0-8005-0a8f70c146fd" (UID: "bfd57a5c-0892-46a0-8005-0a8f70c146fd"). InnerVolumeSpecName "kube-api-access-rt6vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.486328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg" (OuterVolumeSpecName: "kube-api-access-vspvg") pod "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" (UID: "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b"). InnerVolumeSpecName "kube-api-access-vspvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.488298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bfd57a5c-0892-46a0-8005-0a8f70c146fd" (UID: "bfd57a5c-0892-46a0-8005-0a8f70c146fd"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.488342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc" (OuterVolumeSpecName: "kube-api-access-wpnrc") pod "77564a1c-aefc-4caf-86d9-55c2ef795bb7" (UID: "77564a1c-aefc-4caf-86d9-55c2ef795bb7"). InnerVolumeSpecName "kube-api-access-wpnrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.510388 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f414667-865d-4c89-b470-50f61a11b60e" (UID: "2f414667-865d-4c89-b470-50f61a11b60e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.550027 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee660135-f5e2-420e-a242-440471e57da2" (UID: "ee660135-f5e2-420e-a242-440471e57da2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576096 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576127 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576145 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576157 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576168 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576181 4962 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576191 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576203 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576218 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576229 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576270 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576283 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576294 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.585910 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" (UID: "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.629855 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77564a1c-aefc-4caf-86d9-55c2ef795bb7" (UID: "77564a1c-aefc-4caf-86d9-55c2ef795bb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.677915 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.677963 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.704713 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.704830 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.704866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerDied","Data":"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.705035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerDied","Data":"c81440f2bd45daadf6efa1fe9a3de8fa8cfa794ff12c8106c2aad73b69faa130"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.705077 4962 scope.go:117] "RemoveContainer" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709236 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f414667-865d-4c89-b470-50f61a11b60e" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709331 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"4ba52fa324168e2ee08b42cdefbfc041b14744aa5c09a51cbc5628b6f08e9f57"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.715778 4962 generic.go:334] "Generic (PLEG): container finished" podID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.715896 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.716018 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.716118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"962bcd42638f9814f3627f1f0129094057257d45837d02365bb3acaa7e0e1287"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721099 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721185 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"dd70ef6c640a62edc318879e7e0b88b18026337e7b55ef136a0601bdad9e609c"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.727924 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee660135-f5e2-420e-a242-440471e57da2" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.727979 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.728034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.728059 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.742288 4962 scope.go:117] "RemoveContainer" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.743182 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104\": container with ID starting with 3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104 not found: ID does not exist" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.743226 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104"} err="failed to get container status \"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104\": rpc error: code = NotFound desc = could not find container \"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104\": container with ID starting with 3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.743250 4962 scope.go:117] "RemoveContainer" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.756347 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.756395 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.767998 4962 scope.go:117] "RemoveContainer" 
containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.768630 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.784757 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.790139 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.803069 4962 scope.go:117] "RemoveContainer" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.809062 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.821259 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.824065 4962 scope.go:117] "RemoveContainer" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.824662 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786\": container with ID starting with 4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786 not found: ID does not exist" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.824746 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786"} 
err="failed to get container status \"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786\": rpc error: code = NotFound desc = could not find container \"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786\": container with ID starting with 4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.824783 4962 scope.go:117] "RemoveContainer" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.825079 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a\": container with ID starting with cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a not found: ID does not exist" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825099 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a"} err="failed to get container status \"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a\": rpc error: code = NotFound desc = could not find container \"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a\": container with ID starting with cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825113 4962 scope.go:117] "RemoveContainer" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.825389 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf\": container with ID starting with 4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf not found: ID does not exist" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825551 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf"} err="failed to get container status \"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf\": rpc error: code = NotFound desc = could not find container \"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf\": container with ID starting with 4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825635 4962 scope.go:117] "RemoveContainer" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.828648 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.831170 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.836380 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.842778 4962 scope.go:117] "RemoveContainer" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.869249 4962 scope.go:117] "RemoveContainer" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.886411 4962 
scope.go:117] "RemoveContainer" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.887170 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315\": container with ID starting with 24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315 not found: ID does not exist" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887220 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315"} err="failed to get container status \"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315\": rpc error: code = NotFound desc = could not find container \"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315\": container with ID starting with 24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887253 4962 scope.go:117] "RemoveContainer" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.887814 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb\": container with ID starting with 1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb not found: ID does not exist" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887860 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb"} err="failed to get container status \"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb\": rpc error: code = NotFound desc = could not find container \"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb\": container with ID starting with 1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887893 4962 scope.go:117] "RemoveContainer" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.888193 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1\": container with ID starting with 7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1 not found: ID does not exist" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.888226 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1"} err="failed to get container status \"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1\": rpc error: code = NotFound desc = could not find container \"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1\": container with ID starting with 7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.888245 4962 scope.go:117] "RemoveContainer" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.901986 4962 scope.go:117] "RemoveContainer" 
containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.922053 4962 scope.go:117] "RemoveContainer" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.944668 4962 scope.go:117] "RemoveContainer" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.945129 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b\": container with ID starting with ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b not found: ID does not exist" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.945161 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b"} err="failed to get container status \"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b\": rpc error: code = NotFound desc = could not find container \"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b\": container with ID starting with ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.945184 4962 scope.go:117] "RemoveContainer" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.946102 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d\": container with ID starting with 
61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d not found: ID does not exist" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946135 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d"} err="failed to get container status \"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d\": rpc error: code = NotFound desc = could not find container \"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d\": container with ID starting with 61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946158 4962 scope.go:117] "RemoveContainer" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.946583 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c\": container with ID starting with 9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c not found: ID does not exist" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946713 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c"} err="failed to get container status \"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c\": rpc error: code = NotFound desc = could not find container \"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c\": container with ID starting with 9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c not found: ID does not 
exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946768 4962 scope.go:117] "RemoveContainer" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.960782 4962 scope.go:117] "RemoveContainer" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.980506 4962 scope.go:117] "RemoveContainer" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.995687 4962 scope.go:117] "RemoveContainer" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.996213 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849\": container with ID starting with c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849 not found: ID does not exist" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996260 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849"} err="failed to get container status \"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849\": rpc error: code = NotFound desc = could not find container \"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849\": container with ID starting with c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996293 4962 scope.go:117] "RemoveContainer" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" Feb 20 10:00:06 crc 
kubenswrapper[4962]: E0220 10:00:06.996755 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8\": container with ID starting with 8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8 not found: ID does not exist" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996777 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8"} err="failed to get container status \"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8\": rpc error: code = NotFound desc = could not find container \"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8\": container with ID starting with 8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996796 4962 scope.go:117] "RemoveContainer" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.997922 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c\": container with ID starting with eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c not found: ID does not exist" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.997962 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c"} err="failed to get container status 
\"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c\": rpc error: code = NotFound desc = could not find container \"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c\": container with ID starting with eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c not found: ID does not exist" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.146732 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f414667-865d-4c89-b470-50f61a11b60e" path="/var/lib/kubelet/pods/2f414667-865d-4c89-b470-50f61a11b60e/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.147734 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" path="/var/lib/kubelet/pods/77564a1c-aefc-4caf-86d9-55c2ef795bb7/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.148381 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" path="/var/lib/kubelet/pods/bfd57a5c-0892-46a0-8005-0a8f70c146fd/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.149762 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" path="/var/lib/kubelet/pods/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.150389 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee660135-f5e2-420e-a242-440471e57da2" path="/var/lib/kubelet/pods/ee660135-f5e2-420e-a242-440471e57da2/volumes" Feb 20 10:00:08 crc kubenswrapper[4962]: I0220 10:00:08.898854 4962 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.770995 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774501 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774629 4962 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5" exitCode=137 Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5"} Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774809 4962 scope.go:117] "RemoveContainer" containerID="1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce" Feb 20 10:00:13 crc kubenswrapper[4962]: I0220 10:00:13.782835 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 10:00:13 crc kubenswrapper[4962]: I0220 10:00:13.784095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41526e761d4a88d1fb89f92c4de23bad31a37ec70f99f184c60c52623f2183f3"} Feb 20 10:00:16 crc kubenswrapper[4962]: I0220 10:00:16.570425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:22 crc kubenswrapper[4962]: I0220 
10:00:22.458070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:22 crc kubenswrapper[4962]: I0220 10:00:22.462426 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:26 crc kubenswrapper[4962]: I0220 10:00:26.576964 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.850065 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.850841 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" containerID="cri-o://d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" gracePeriod=30 Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.853150 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.853396 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" containerID="cri-o://1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" gracePeriod=30 Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863330 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mzhb4"] Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 
10:00:30.863659 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863678 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863696 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863704 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863717 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863725 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863734 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863741 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863750 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863760 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 
10:00:30.863770 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863779 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863788 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863796 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863806 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863813 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863821 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863831 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863840 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863847 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863857 
4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863864 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863879 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863892 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863898 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863908 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerName="installer" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863915 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerName="installer" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863924 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863932 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864042 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864058 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864072 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerName="installer" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864082 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864095 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864103 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864113 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864649 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.867298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.868330 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.868520 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.869248 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.880728 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mzhb4"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.883056 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.953888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.953967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcj67\" (UniqueName: \"kubernetes.io/projected/34e2f7a3-366d-4817-a502-720b5f9a782e-kube-api-access-jcj67\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: 
\"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.954169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.989433 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.990204 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.993486 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.993690 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.998103 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.055027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.055088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.055133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcj67\" (UniqueName: \"kubernetes.io/projected/34e2f7a3-366d-4817-a502-720b5f9a782e-kube-api-access-jcj67\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.056696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.066732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.080886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcj67\" (UniqueName: 
\"kubernetes.io/projected/34e2f7a3-366d-4817-a502-720b5f9a782e-kube-api-access-jcj67\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.156102 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.156151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.156176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.192454 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.259491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.259551 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.259620 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.266689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.267802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.290053 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.329374 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.330999 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.383445 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461602 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461644 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461671 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461841 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461871 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.462955 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.463875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca" (OuterVolumeSpecName: "client-ca") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.464268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca" (OuterVolumeSpecName: "client-ca") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.464443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config" (OuterVolumeSpecName: "config") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.465713 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx" (OuterVolumeSpecName: "kube-api-access-945rx") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "kube-api-access-945rx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.466881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.467835 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.468061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4" (OuterVolumeSpecName: "kube-api-access-zqhd4") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "kube-api-access-zqhd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.470819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config" (OuterVolumeSpecName: "config") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.564720 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565151 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565166 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565176 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565186 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565196 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565207 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565217 4962 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565226 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.584987 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mzhb4"] Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.630753 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:00:31 crc kubenswrapper[4962]: W0220 10:00:31.651715 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d2cbc3_9bc4_4270_9d26_66c3e9189f8e.slice/crio-68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd WatchSource:0}: Error finding container 68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd: Status 404 returned error can't find the container with id 68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.928791 4962 generic.go:334] "Generic (PLEG): container finished" podID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" exitCode=0 Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.928849 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerDied","Data":"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c"} Feb 20 10:00:31 crc 
kubenswrapper[4962]: I0220 10:00:31.930117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerDied","Data":"c9ca7261143890db86b7247b8197f46263302fc4c677314a7e1a1eadf9f9acf2"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.928870 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.930201 4962 scope.go:117] "RemoveContainer" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.931452 4962 generic.go:334] "Generic (PLEG): container finished" podID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerID="9e4ff8bca8b9c2f6e4f08722be3898de4e9890a93fcdc65b9b078dd1d1fbdae2" exitCode=0 Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.931560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" event={"ID":"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e","Type":"ContainerDied","Data":"9e4ff8bca8b9c2f6e4f08722be3898de4e9890a93fcdc65b9b078dd1d1fbdae2"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.931650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" event={"ID":"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e","Type":"ContainerStarted","Data":"68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.935569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" 
event={"ID":"34e2f7a3-366d-4817-a502-720b5f9a782e","Type":"ContainerStarted","Data":"b90b44b5bb302c758985e233e37a6262525fb60ab2dc8f60a3090a1ae4aed5b5"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.935611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" event={"ID":"34e2f7a3-366d-4817-a502-720b5f9a782e","Type":"ContainerStarted","Data":"e29275785e3fcebab481819ca8f24356f98fcfcafcdf2365bd92827c75e2546a"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.936157 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.937891 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mzhb4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.937947 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" podUID="34e2f7a3-366d-4817-a502-720b5f9a782e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941442 4962 generic.go:334] "Generic (PLEG): container finished" podID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" exitCode=0 Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941509 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" 
event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerDied","Data":"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerDied","Data":"3bea97da1320becf13fecaed38868cc74c4f54c7308979ccb795e3bbe8eacf06"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941655 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.968228 4962 scope.go:117] "RemoveContainer" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" Feb 20 10:00:31 crc kubenswrapper[4962]: E0220 10:00:31.968870 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c\": container with ID starting with 1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c not found: ID does not exist" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.968912 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c"} err="failed to get container status \"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c\": rpc error: code = NotFound desc = could not find container \"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c\": container with ID starting with 1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c not found: ID does not exist" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 
10:00:31.968943 4962 scope.go:117] "RemoveContainer" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.970859 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" podStartSLOduration=1.970845373 podStartE2EDuration="1.970845373s" podCreationTimestamp="2026-02-20 10:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:31.967837348 +0000 UTC m=+323.550309204" watchObservedRunningTime="2026-02-20 10:00:31.970845373 +0000 UTC m=+323.553317239" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.983444 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.983577 4962 scope.go:117] "RemoveContainer" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" Feb 20 10:00:31 crc kubenswrapper[4962]: E0220 10:00:31.984160 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f\": container with ID starting with d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f not found: ID does not exist" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.984235 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f"} err="failed to get container status \"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f\": rpc error: code = NotFound desc = could not find container 
\"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f\": container with ID starting with d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f not found: ID does not exist" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.985635 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.000569 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.009359 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442301 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:00:32 crc kubenswrapper[4962]: E0220 10:00:32.442753 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442781 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: E0220 10:00:32.442819 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442833 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442996 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" 
containerName="route-controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.443021 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.443762 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.447367 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.447658 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.447898 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.448138 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.448630 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.448958 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.452994 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453124 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453242 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453321 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453627 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.455734 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.456129 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.463267 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.469057 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.471123 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:00:32 crc 
kubenswrapper[4962]: I0220 10:00:32.475451 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580200 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580389 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580637 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc 
kubenswrapper[4962]: I0220 10:00:32.580691 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580751 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580945 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688451 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc 
kubenswrapper[4962]: I0220 10:00:32.688543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688644 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: 
\"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.690650 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.691068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.691020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.693982 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.696150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.697079 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.698834 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.707729 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.708102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: 
I0220 10:00:32.774655 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.784809 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.960742 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.096575 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.150452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" path="/var/lib/kubelet/pods/6adbe475-48f9-4ba3-82bd-b36bcd939168/volumes" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.151640 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" path="/var/lib/kubelet/pods/8da2028c-f296-4f44-b010-b3abec9f6b98/volumes" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.250315 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:00:33 crc kubenswrapper[4962]: W0220 10:00:33.255310 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2178fa_96db_4c48_bbb2_b4533bb86944.slice/crio-d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8 WatchSource:0}: Error finding container d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8: Status 404 returned error can't find the container with id 
d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8 Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.260113 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.398134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.398252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.398301 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.399081 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" (UID: "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.405355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" (UID: "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.405793 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj" (OuterVolumeSpecName: "kube-api-access-279kj") pod "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" (UID: "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e"). InnerVolumeSpecName "kube-api-access-279kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.499744 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.499792 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.499808 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.960719 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" 
event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerStarted","Data":"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.961286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerStarted","Data":"d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.962752 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.963915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerStarted","Data":"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.963944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerStarted","Data":"6a10639fda43ab810b743ff2fde0f7850126d9a967a46f22e12d36e472e668d8"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.964472 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.968214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" event={"ID":"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e","Type":"ContainerDied","Data":"68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 
10:00:33.968253 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.968283 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.971957 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.972521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.983841 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" podStartSLOduration=3.983811706 podStartE2EDuration="3.983811706s" podCreationTimestamp="2026-02-20 10:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:33.982880166 +0000 UTC m=+325.565352012" watchObservedRunningTime="2026-02-20 10:00:33.983811706 +0000 UTC m=+325.566283562" Feb 20 10:00:34 crc kubenswrapper[4962]: I0220 10:00:34.061443 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" podStartSLOduration=4.061411144 podStartE2EDuration="4.061411144s" podCreationTimestamp="2026-02-20 10:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:34.058176352 +0000 UTC m=+325.640648198" watchObservedRunningTime="2026-02-20 10:00:34.061411144 +0000 
UTC m=+325.643883000" Feb 20 10:00:34 crc kubenswrapper[4962]: I0220 10:00:34.755388 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:35 crc kubenswrapper[4962]: I0220 10:00:35.983902 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" containerID="cri-o://eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" gracePeriod=30 Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.452248 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.497457 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg"] Feb 20 10:00:36 crc kubenswrapper[4962]: E0220 10:00:36.497844 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerName="collect-profiles" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.497867 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerName="collect-profiles" Feb 20 10:00:36 crc kubenswrapper[4962]: E0220 10:00:36.497906 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.497921 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.498096 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" 
containerName="controller-manager" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.498118 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerName="collect-profiles" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.499005 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.509233 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg"] Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.562968 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563050 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563095 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: 
\"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563268 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.564395 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.564838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca" (OuterVolumeSpecName: "client-ca") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.565089 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config" (OuterVolumeSpecName: "config") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.573200 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.573800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w" (OuterVolumeSpecName: "kube-api-access-jtp6w") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "kube-api-access-jtp6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5x5\" (UniqueName: \"kubernetes.io/projected/3d9d341c-6cc4-41ce-9d8c-2765a8950237-kube-api-access-sj5x5\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-client-ca\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-proxy-ca-bundles\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665649 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-config\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9d341c-6cc4-41ce-9d8c-2765a8950237-serving-cert\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666343 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666374 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666404 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666423 4962 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666442 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767382 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-client-ca\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767439 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-proxy-ca-bundles\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-config\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9d341c-6cc4-41ce-9d8c-2765a8950237-serving-cert\") pod 
\"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767547 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5x5\" (UniqueName: \"kubernetes.io/projected/3d9d341c-6cc4-41ce-9d8c-2765a8950237-kube-api-access-sj5x5\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.769158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-proxy-ca-bundles\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.769761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-client-ca\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.770128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-config\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.773262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3d9d341c-6cc4-41ce-9d8c-2765a8950237-serving-cert\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.788781 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5x5\" (UniqueName: \"kubernetes.io/projected/3d9d341c-6cc4-41ce-9d8c-2765a8950237-kube-api-access-sj5x5\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.824915 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992777 4962 generic.go:334] "Generic (PLEG): container finished" podID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" exitCode=0 Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerDied","Data":"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4"} Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerDied","Data":"6a10639fda43ab810b743ff2fde0f7850126d9a967a46f22e12d36e472e668d8"} Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992885 4962 scope.go:117] "RemoveContainer" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" Feb 20 
10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.993013 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.042557 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.046563 4962 scope.go:117] "RemoveContainer" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.047815 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:37 crc kubenswrapper[4962]: E0220 10:00:37.048516 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4\": container with ID starting with eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4 not found: ID does not exist" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.048555 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4"} err="failed to get container status \"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4\": rpc error: code = NotFound desc = could not find container \"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4\": container with ID starting with eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4 not found: ID does not exist" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.148051 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" path="/var/lib/kubelet/pods/801d9c9e-28d3-49dc-9db6-9818197a563a/volumes" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.342714 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg"] Feb 20 10:00:37 crc kubenswrapper[4962]: W0220 10:00:37.351385 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9d341c_6cc4_41ce_9d8c_2765a8950237.slice/crio-966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec WatchSource:0}: Error finding container 966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec: Status 404 returned error can't find the container with id 966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.001010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" event={"ID":"3d9d341c-6cc4-41ce-9d8c-2765a8950237","Type":"ContainerStarted","Data":"b9a969d7ecc361cbf8df0410aa0c35284370b33497d064c68a5a69fd6cf61662"} Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.001095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" event={"ID":"3d9d341c-6cc4-41ce-9d8c-2765a8950237","Type":"ContainerStarted","Data":"966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec"} Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.003658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.013508 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 
10:00:38.029408 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" podStartSLOduration=4.029387548 podStartE2EDuration="4.029387548s" podCreationTimestamp="2026-02-20 10:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:38.025958149 +0000 UTC m=+329.608429995" watchObservedRunningTime="2026-02-20 10:00:38.029387548 +0000 UTC m=+329.611859394" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.230390 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lllvp"] Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.231258 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.252021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lllvp"] Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-tls\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-certificates\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc 
kubenswrapper[4962]: I0220 10:00:38.391777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-bound-sa-token\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4b4\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-kube-api-access-gq4b4\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-trusted-ca\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.417769 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.493689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-bound-sa-token\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494288 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4b4\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-kube-api-access-gq4b4\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-trusted-ca\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-tls\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-certificates\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.495938 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-trusted-ca\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.496200 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-certificates\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.502300 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-tls\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.502997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lllvp\" (UID: 
\"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.515096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-bound-sa-token\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.522495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4b4\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-kube-api-access-gq4b4\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.594693 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:39 crc kubenswrapper[4962]: I0220 10:00:39.053338 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lllvp"] Feb 20 10:00:39 crc kubenswrapper[4962]: W0220 10:00:39.062798 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32b10f1_4fdb_4320_a7d9_6f70bbdc0929.slice/crio-63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5 WatchSource:0}: Error finding container 63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5: Status 404 returned error can't find the container with id 63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5 Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.020673 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" event={"ID":"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929","Type":"ContainerStarted","Data":"59cb24cb15e07642645b817d540415fe3edf712b8073107df5e2dc5eac04ab0e"} Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.021163 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.021181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" event={"ID":"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929","Type":"ContainerStarted","Data":"63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5"} Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.070250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" podStartSLOduration=2.070213669 podStartE2EDuration="2.070213669s" podCreationTimestamp="2026-02-20 10:00:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:40.063295969 +0000 UTC m=+331.645767835" watchObservedRunningTime="2026-02-20 10:00:40.070213669 +0000 UTC m=+331.652685555" Feb 20 10:00:41 crc kubenswrapper[4962]: I0220 10:00:41.508553 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:00:41 crc kubenswrapper[4962]: I0220 10:00:41.509243 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.780830 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sl4km"] Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.782184 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.784865 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.797695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sl4km"] Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.903476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-utilities\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.903759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hrfs\" (UniqueName: \"kubernetes.io/projected/0e92c119-6503-4fc1-b607-0d41d821e8fe-kube-api-access-6hrfs\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.904068 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-catalog-content\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.981624 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j9hxw"] Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.982952 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.986812 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.994548 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9hxw"] Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005045 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-catalog-content\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-utilities\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hrfs\" (UniqueName: \"kubernetes.io/projected/0e92c119-6503-4fc1-b607-0d41d821e8fe-kube-api-access-6hrfs\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-catalog-content\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " 
pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.006080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-utilities\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.033937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hrfs\" (UniqueName: \"kubernetes.io/projected/0e92c119-6503-4fc1-b607-0d41d821e8fe-kube-api-access-6hrfs\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.106737 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-utilities\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.106787 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-catalog-content\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.106813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5bk\" (UniqueName: \"kubernetes.io/projected/82f8db6b-4715-42f3-a705-821af9e03156-kube-api-access-lw5bk\") pod \"redhat-operators-j9hxw\" (UID: 
\"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.143876 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.209099 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-utilities\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.209188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-catalog-content\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.209249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5bk\" (UniqueName: \"kubernetes.io/projected/82f8db6b-4715-42f3-a705-821af9e03156-kube-api-access-lw5bk\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.211089 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-utilities\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.211109 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-catalog-content\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.240126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5bk\" (UniqueName: \"kubernetes.io/projected/82f8db6b-4715-42f3-a705-821af9e03156-kube-api-access-lw5bk\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.297259 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.597527 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sl4km"] Feb 20 10:00:51 crc kubenswrapper[4962]: W0220 10:00:51.602912 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e92c119_6503_4fc1_b607_0d41d821e8fe.slice/crio-1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4 WatchSource:0}: Error finding container 1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4: Status 404 returned error can't find the container with id 1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4 Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.701111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9hxw"] Feb 20 10:00:51 crc kubenswrapper[4962]: W0220 10:00:51.707941 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f8db6b_4715_42f3_a705_821af9e03156.slice/crio-4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999 
WatchSource:0}: Error finding container 4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999: Status 404 returned error can't find the container with id 4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999 Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.107729 4962 generic.go:334] "Generic (PLEG): container finished" podID="0e92c119-6503-4fc1-b607-0d41d821e8fe" containerID="75b1b6573c381433cde8a7b8d56c45183daa17d6959a43902cbb5b72476056fc" exitCode=0 Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.107796 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerDied","Data":"75b1b6573c381433cde8a7b8d56c45183daa17d6959a43902cbb5b72476056fc"} Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.108138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerStarted","Data":"1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4"} Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.112783 4962 generic.go:334] "Generic (PLEG): container finished" podID="82f8db6b-4715-42f3-a705-821af9e03156" containerID="62f673ffb2ae5bb54aa9b1b375a914a7b13750b5fd842a4abf93abeb1bbb0f43" exitCode=0 Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.112831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerDied","Data":"62f673ffb2ae5bb54aa9b1b375a914a7b13750b5fd842a4abf93abeb1bbb0f43"} Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.112867 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" 
event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerStarted","Data":"4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999"} Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.121166 4962 generic.go:334] "Generic (PLEG): container finished" podID="0e92c119-6503-4fc1-b607-0d41d821e8fe" containerID="768ffc49e7fd2e11d9b986aa928149b738fe25633e0327c75baf58497f044efd" exitCode=0 Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.121747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerDied","Data":"768ffc49e7fd2e11d9b986aa928149b738fe25633e0327c75baf58497f044efd"} Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.176040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerStarted","Data":"5b3dc93152a6aa3e4d4bdd5da4281efd77087dffbfeb56667dea0c10ca907dd6"} Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.194425 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.195618 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.210089 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.229845 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.345871 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.345987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.346043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.387739 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsx57"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.389173 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.391921 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.393021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsx57"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.447557 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.447638 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.447715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.448017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " 
pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.449067 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.468176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.537640 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.550351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-utilities\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.550411 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7bf\" (UniqueName: \"kubernetes.io/projected/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-kube-api-access-bg7bf\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.550762 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-catalog-content\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.652623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-catalog-content\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653064 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-utilities\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7bf\" (UniqueName: \"kubernetes.io/projected/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-kube-api-access-bg7bf\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653474 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-catalog-content\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653952 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-utilities\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.672660 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7bf\" (UniqueName: \"kubernetes.io/projected/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-kube-api-access-bg7bf\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.810830 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.985028 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 10:00:53 crc kubenswrapper[4962]: W0220 10:00:53.990320 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63e8904_d4b9_405f_94a1_f44cb565b3e7.slice/crio-8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70 WatchSource:0}: Error finding container 8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70: Status 404 returned error can't find the container with id 8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70 Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.185791 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerStarted","Data":"f151c6c6ae9163db4285b46a590a250b44e12af14e3ee33e2dacf43e63c1f999"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.187508 4962 generic.go:334] 
"Generic (PLEG): container finished" podID="82f8db6b-4715-42f3-a705-821af9e03156" containerID="5b3dc93152a6aa3e4d4bdd5da4281efd77087dffbfeb56667dea0c10ca907dd6" exitCode=0 Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.187594 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerDied","Data":"5b3dc93152a6aa3e4d4bdd5da4281efd77087dffbfeb56667dea0c10ca907dd6"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.189470 4962 generic.go:334] "Generic (PLEG): container finished" podID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerID="0dadcfbc9540e03300cad39a3785cea06d52c326b71fdd2b10314f009df918de" exitCode=0 Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.189511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"0dadcfbc9540e03300cad39a3785cea06d52c326b71fdd2b10314f009df918de"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.189541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerStarted","Data":"8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.214519 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sl4km" podStartSLOduration=2.713011942 podStartE2EDuration="4.214493209s" podCreationTimestamp="2026-02-20 10:00:50 +0000 UTC" firstStartedPulling="2026-02-20 10:00:52.10919236 +0000 UTC m=+343.691664226" lastFinishedPulling="2026-02-20 10:00:53.610673647 +0000 UTC m=+345.193145493" observedRunningTime="2026-02-20 10:00:54.211524304 +0000 UTC m=+345.793996160" watchObservedRunningTime="2026-02-20 10:00:54.214493209 +0000 
UTC m=+345.796965065" Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.241785 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsx57"] Feb 20 10:00:54 crc kubenswrapper[4962]: W0220 10:00:54.245951 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b4ee5b_87f1_4b91_abd0_d2a7eb56e7bd.slice/crio-f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b WatchSource:0}: Error finding container f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b: Status 404 returned error can't find the container with id f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.200450 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerStarted","Data":"63089f43c76a2f21cbce4a8383f7399bc851fc4bd66badd6f54366ce687b26c8"} Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.203476 4962 generic.go:334] "Generic (PLEG): container finished" podID="16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd" containerID="05f8b028cb076a04d157dea166964f8cd1ad3e2d4cb618244e94ea9a474f4dbd" exitCode=0 Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.203570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerDied","Data":"05f8b028cb076a04d157dea166964f8cd1ad3e2d4cb618244e94ea9a474f4dbd"} Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.203657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerStarted","Data":"f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b"} Feb 20 10:00:55 crc kubenswrapper[4962]: 
I0220 10:00:55.218645 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j9hxw" podStartSLOduration=2.7043437150000003 podStartE2EDuration="5.218626336s" podCreationTimestamp="2026-02-20 10:00:50 +0000 UTC" firstStartedPulling="2026-02-20 10:00:52.115625304 +0000 UTC m=+343.698097150" lastFinishedPulling="2026-02-20 10:00:54.629907925 +0000 UTC m=+346.212379771" observedRunningTime="2026-02-20 10:00:55.216982625 +0000 UTC m=+346.799454471" watchObservedRunningTime="2026-02-20 10:00:55.218626336 +0000 UTC m=+346.801098182" Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.211736 4962 generic.go:334] "Generic (PLEG): container finished" podID="16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd" containerID="3cd190a0c2fc6a4735fdad7f6ba7fdb0f7749184990d46ef96f36b2f1568c407" exitCode=0 Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.211815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerDied","Data":"3cd190a0c2fc6a4735fdad7f6ba7fdb0f7749184990d46ef96f36b2f1568c407"} Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.213877 4962 generic.go:334] "Generic (PLEG): container finished" podID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerID="4a3a18f226977c3365d615b122597c40567fbf4037342852763a1652d9c44e94" exitCode=0 Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.213935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"4a3a18f226977c3365d615b122597c40567fbf4037342852763a1652d9c44e94"} Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.219954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" 
event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerStarted","Data":"89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab"} Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.223020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerStarted","Data":"7137b7222dc34e33b4b934560f10e0f159557bbaac8ab7183eaa47b62b1b2dd5"} Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.245124 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7zlm" podStartSLOduration=1.842846029 podStartE2EDuration="4.245103951s" podCreationTimestamp="2026-02-20 10:00:53 +0000 UTC" firstStartedPulling="2026-02-20 10:00:54.190718323 +0000 UTC m=+345.773190179" lastFinishedPulling="2026-02-20 10:00:56.592976255 +0000 UTC m=+348.175448101" observedRunningTime="2026-02-20 10:00:57.2409529 +0000 UTC m=+348.823424746" watchObservedRunningTime="2026-02-20 10:00:57.245103951 +0000 UTC m=+348.827575797" Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.263817 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsx57" podStartSLOduration=2.860559978 podStartE2EDuration="4.263798895s" podCreationTimestamp="2026-02-20 10:00:53 +0000 UTC" firstStartedPulling="2026-02-20 10:00:55.204969503 +0000 UTC m=+346.787441349" lastFinishedPulling="2026-02-20 10:00:56.60820842 +0000 UTC m=+348.190680266" observedRunningTime="2026-02-20 10:00:57.260417998 +0000 UTC m=+348.842889844" watchObservedRunningTime="2026-02-20 10:00:57.263798895 +0000 UTC m=+348.846270741" Feb 20 10:00:58 crc kubenswrapper[4962]: I0220 10:00:58.602791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:58 crc kubenswrapper[4962]: I0220 10:00:58.678441 
4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.145230 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.145597 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.181412 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.292521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.298440 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.298481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.348427 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:02 crc kubenswrapper[4962]: I0220 10:01:02.313107 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.538053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.538481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.612417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.811732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.811791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.876946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:04 crc kubenswrapper[4962]: I0220 10:01:04.320942 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:04 crc kubenswrapper[4962]: I0220 10:01:04.329835 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:11 crc kubenswrapper[4962]: I0220 10:01:11.508266 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:01:11 crc kubenswrapper[4962]: I0220 10:01:11.508742 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:01:23 crc kubenswrapper[4962]: I0220 
10:01:23.721516 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" containerID="cri-o://c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" gracePeriod=30 Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.239939 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333100 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333320 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc 
kubenswrapper[4962]: I0220 10:01:24.333385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.334120 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.334151 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.334392 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.335141 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.339363 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.340105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.340664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm" (OuterVolumeSpecName: "kube-api-access-gmtxm") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "kube-api-access-gmtxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.340957 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.347798 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.360759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.373747 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" exitCode=0 Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.373797 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.373818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerDied","Data":"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46"} Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.374143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerDied","Data":"5004d974da71f7174ba7d6f42652143c4f7cb0b752e3647e653cb9e55b56d9b3"} Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.374176 4962 scope.go:117] "RemoveContainer" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.402639 4962 scope.go:117] "RemoveContainer" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" Feb 20 10:01:24 crc kubenswrapper[4962]: E0220 10:01:24.403551 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46\": container with ID starting with c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46 not found: ID does not exist" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.403803 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46"} err="failed to get container status \"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46\": rpc error: code = NotFound desc = could not find container 
\"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46\": container with ID starting with c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46 not found: ID does not exist" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.418149 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.424961 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.435910 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436068 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436177 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436312 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436404 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436490 4962 
reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436623 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:25 crc kubenswrapper[4962]: I0220 10:01:25.153591 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" path="/var/lib/kubelet/pods/b4ad1819-20e1-406b-8499-5a73780c0a0c/volumes" Feb 20 10:01:34 crc kubenswrapper[4962]: I0220 10:01:34.736287 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:01:34 crc kubenswrapper[4962]: I0220 10:01:34.737499 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" containerID="cri-o://4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" gracePeriod=30 Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.138146 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.314699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.314925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.315066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.315829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.316703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.316731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config" (OuterVolumeSpecName: "config") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.321719 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.322624 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk" (OuterVolumeSpecName: "kube-api-access-hv9rk") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "kube-api-access-hv9rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.417944 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.418028 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.418051 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.418077 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447123 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" exitCode=0 Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerDied","Data":"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5"} Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447315 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerDied","Data":"d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8"} Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447399 4962 scope.go:117] "RemoveContainer" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.478444 4962 scope.go:117] "RemoveContainer" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" Feb 20 10:01:35 crc kubenswrapper[4962]: E0220 10:01:35.479216 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5\": container with ID starting with 4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5 not found: ID does not exist" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.479312 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5"} err="failed to get container status \"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5\": rpc error: code = NotFound desc = could not find container \"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5\": container with ID starting with 4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5 not found: ID does not exist" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.496504 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.503121 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484433 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8"] Feb 20 10:01:36 crc kubenswrapper[4962]: E0220 10:01:36.484690 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484706 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" Feb 20 10:01:36 crc kubenswrapper[4962]: E0220 10:01:36.484730 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484737 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484851 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484859 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.485231 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488060 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488791 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488982 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.489065 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488989 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.491109 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.498616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8"] Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634208 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-client-ca\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634372 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a393401-cb35-4a65-9be1-cb3956d6b44a-serving-cert\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-config\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634537 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5g2\" (UniqueName: \"kubernetes.io/projected/7a393401-cb35-4a65-9be1-cb3956d6b44a-kube-api-access-zs5g2\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.736872 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5g2\" (UniqueName: \"kubernetes.io/projected/7a393401-cb35-4a65-9be1-cb3956d6b44a-kube-api-access-zs5g2\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.736991 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-client-ca\") pod 
\"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.737059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a393401-cb35-4a65-9be1-cb3956d6b44a-serving-cert\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.737120 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-config\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.739161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-client-ca\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.739280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-config\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.752847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a393401-cb35-4a65-9be1-cb3956d6b44a-serving-cert\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.770203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5g2\" (UniqueName: \"kubernetes.io/projected/7a393401-cb35-4a65-9be1-cb3956d6b44a-kube-api-access-zs5g2\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.810214 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.081933 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8"] Feb 20 10:01:37 crc kubenswrapper[4962]: W0220 10:01:37.093133 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a393401_cb35_4a65_9be1_cb3956d6b44a.slice/crio-2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6 WatchSource:0}: Error finding container 2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6: Status 404 returned error can't find the container with id 2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6 Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.146342 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" path="/var/lib/kubelet/pods/2c2178fa-96db-4c48-bbb2-b4533bb86944/volumes" Feb 20 10:01:37 crc kubenswrapper[4962]: 
I0220 10:01:37.471826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" event={"ID":"7a393401-cb35-4a65-9be1-cb3956d6b44a","Type":"ContainerStarted","Data":"057587bac126991d0fab9a799fe34d5777fe6a028016633e8281ad0b5a6efe21"} Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.471871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" event={"ID":"7a393401-cb35-4a65-9be1-cb3956d6b44a","Type":"ContainerStarted","Data":"2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6"} Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.472185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.497340 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" podStartSLOduration=3.497321296 podStartE2EDuration="3.497321296s" podCreationTimestamp="2026-02-20 10:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:01:37.496364236 +0000 UTC m=+389.078836122" watchObservedRunningTime="2026-02-20 10:01:37.497321296 +0000 UTC m=+389.079793142" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.623633 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.508239 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.509092 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.510416 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.511773 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.511886 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa" gracePeriod=600 Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515117 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa" exitCode=0 Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515244 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa"} Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838"} Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515831 4962 scope.go:117] "RemoveContainer" containerID="dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432" Feb 20 10:03:41 crc kubenswrapper[4962]: I0220 10:03:41.508722 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:03:41 crc kubenswrapper[4962]: I0220 10:03:41.509828 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:11 crc kubenswrapper[4962]: I0220 10:04:11.508842 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:04:11 crc kubenswrapper[4962]: I0220 10:04:11.509534 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.508863 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.510880 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.510974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.511620 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.511680 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838" gracePeriod=600 Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 
10:04:41.770579 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838" exitCode=0 Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.770642 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838"} Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.770676 4962 scope.go:117] "RemoveContainer" containerID="28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa" Feb 20 10:04:42 crc kubenswrapper[4962]: I0220 10:04:42.778769 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"} Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.900186 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"] Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901581 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" containerID="cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901787 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" containerID="cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: 
I0220 10:06:19.901735 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" containerID="cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901843 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901895 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" containerID="cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901939 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" containerID="cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901853 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" containerID="cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.947054 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" 
podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" containerID="cri-o://632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" gracePeriod=30 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.257211 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.259871 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-acl-logging/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.260239 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-controller/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.260574 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.330778 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z6xc2"] Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331292 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331372 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331432 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331480 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" 
containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331523 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331570 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331649 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331701 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331769 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331844 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331897 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331946 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332000 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kubecfg-setup" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332055 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" 
containerName="kubecfg-setup" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332105 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332154 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332280 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332684 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332755 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332811 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332860 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333100 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333160 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" 
containerName="sbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333212 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333257 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333307 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333359 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333410 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333482 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333630 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333690 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333799 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.333961 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.334015 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.334243 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.334297 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.335907 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361539 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgsdk\" (UniqueName: \"kubernetes.io/projected/e12256fd-84a5-4a79-b750-20b5a64bd4c9-kube-api-access-jgsdk\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361579 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-script-lib\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361639 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361672 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-ovn\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-netns\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-slash\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: 
I0220 10:06:20.361776 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-env-overrides\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361812 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-config\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361936 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-systemd-units\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361994 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-etc-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc 
kubenswrapper[4962]: I0220 10:06:20.362173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-bin\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362265 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-netd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-systemd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362403 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-log-socket\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 
10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-kubelet\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-var-lib-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-node-log\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.458239 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/2.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459005 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459093 4962 generic.go:334] "Generic (PLEG): container finished" podID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef" exitCode=2 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerDied","Data":"1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459261 4962 scope.go:117] "RemoveContainer" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.460015 4962 scope.go:117] "RemoveContainer" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.460311 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wqwgj_openshift-multus(1957ac70-30f9-48c2-a82b-72aa3b7a883a)\"" pod="openshift-multus/multus-wqwgj" podUID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.461710 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.462880 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.462928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.462990 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463026 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463991 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.464107 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.464794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465810 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465908 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465947 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466003 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466104 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash" (OuterVolumeSpecName: "host-slash") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466314 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466367 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-env-overrides\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466732 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-config\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-systemd-units\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466885 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-etc-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-bin\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-netd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467226 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-systemd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-log-socket\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467307 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-kubelet\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-var-lib-openvswitch\") 
pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-node-log\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467455 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgsdk\" (UniqueName: \"kubernetes.io/projected/e12256fd-84a5-4a79-b750-20b5a64bd4c9-kube-api-access-jgsdk\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-script-lib\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467619 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-netns\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467691 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-ovn\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-slash\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467812 4962 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467833 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467853 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467874 4962 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467894 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467913 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467933 4962 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467951 4962 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468031 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-slash\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468733 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-bin\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468823 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-systemd-units\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468995 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-acl-logging/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469173 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-etc-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc 
kubenswrapper[4962]: I0220 10:06:20.469547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-netd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-config\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469624 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-systemd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-log-socket\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-kubelet\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-var-lib-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-node-log\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470145 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-env-overrides\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470163 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-controller/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466293 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468104 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468167 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log" (OuterVolumeSpecName: "node-log") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket" (OuterVolumeSpecName: "log-socket") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470336 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-netns\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470487 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-ovn\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.471863 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-script-lib\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472117 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472161 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472170 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472317 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472422 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472445 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" 
containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472464 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472482 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" exitCode=143 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472549 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" exitCode=143 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472657 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472677 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472691 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472703 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472715 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472726 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472738 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472749 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472760 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472772 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472807 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472821 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472832 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472843 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472855 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472866 4962 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472877 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472891 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472904 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472915 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472948 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472961 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 
10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472973 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472985 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472996 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473007 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473019 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473030 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473041 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473055 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 
10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473071 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473091 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473105 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473117 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473128 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473140 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473151 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473162 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473173 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473185 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473196 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473899 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.476400 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.478686 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt" (OuterVolumeSpecName: "kube-api-access-85mbt") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "kube-api-access-85mbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.497870 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.504432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgsdk\" (UniqueName: \"kubernetes.io/projected/e12256fd-84a5-4a79-b750-20b5a64bd4c9-kube-api-access-jgsdk\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.504515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.526783 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.547335 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569201 4962 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569451 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569618 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569721 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569838 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569938 4962 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569958 4962 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569974 4962 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569989 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.570004 4962 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.570016 4962 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.570029 4962 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.574154 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.600722 4962 scope.go:117] 
"RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.622420 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.646758 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.654395 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.665559 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.684987 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.715070 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.735676 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.736226 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.736352 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.736454 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.736970 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737013 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737045 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.737391 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737421 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737442 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.737822 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737884 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container 
\"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737923 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.738341 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738379 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738402 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.738747 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" 
containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738778 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738798 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.739081 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739109 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739126 4962 scope.go:117] 
"RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.739404 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739429 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739452 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.739747 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739774 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739792 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.740049 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740077 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740095 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740306 4962 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740330 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740634 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740741 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741236 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not 
found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741321 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741890 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741936 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742245 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742277 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742547 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get 
container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742574 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742922 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742953 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.743208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.743240 4962 scope.go:117] "RemoveContainer" 
containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744314 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744343 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744909 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744935 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.745272 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could 
not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.745496 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746031 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746055 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746413 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746435 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 
10:06:20.746745 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746782 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747070 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747095 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747370 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 
9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747402 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747698 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747726 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748036 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748068 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748334 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748359 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748656 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748687 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749017 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749047 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749408 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749426 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749719 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749849 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750281 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750304 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750753 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750811 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751181 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751214 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751529 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751693 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752107 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752133 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752701 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752749 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.753222 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist"
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.852101 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"]
Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.862867 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"]
Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.154644 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" path="/var/lib/kubelet/pods/2abd2b70-bb78-49a0-b930-cd066384e803/volumes"
Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.484301 4962 generic.go:334] "Generic (PLEG): container finished" podID="e12256fd-84a5-4a79-b750-20b5a64bd4c9" containerID="1e0c0d88dc5ba3cd8d61d4bf1920c494da947e6d2fe514fe1984e301582cff94" exitCode=0
Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.485461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerDied","Data":"1e0c0d88dc5ba3cd8d61d4bf1920c494da947e6d2fe514fe1984e301582cff94"}
Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.485657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"a175345ba64c0874825a85220209a3a8247dff3c0e0beb24f2d075a628b1279a"}
Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.493043 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/2.log"
Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.506401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"92e3dde518304618e01ddb3cc717c779c80e50fc6b3751f36f735fae92c42114"}
Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507779 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"2f58ea068ce21b88343d88c9636ddb42f308bc3735b14e8567e338d309ce9d6a"}
Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"afebb2b6d25667fd01a6758a1584dec69db4671c42b4025177214e7234b98039"}
Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"32bfd6870b90b35716f3e8be00392c97ed3eaaba2cd7a5263cffd2fae9b90665"}
Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"5723db0484710231bd6b6e5a91643e47eb22ac3f7e4212e0f75b13cd108221f3"}
Feb 20 10:06:23 crc kubenswrapper[4962]: I0220 10:06:23.521125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"ac789f0ba10acfdc335d55607b1373243f9a97868dcdf1e1c1b083181e306baf"}
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.306740 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"]
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.308555 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.312331 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.316245 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.317040 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.316839 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bwxwq"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.358128 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.358236 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.358304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.459710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.459847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.460043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.460185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.461054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.501457 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.553462 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"fe1ae8910f5f801a05a874e20d8dc09e0d5a8ff072bd234b90cdff966cd46572"}
Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.628726 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671013 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671145 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671185 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671281 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9v9g5" podUID="6423ea5e-20ed-4977-a842-2bc521939341"
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.582674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"7b825f72902849a1879d093d4836387732d06857ded48764e201dfdb9b927cc6"}
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.583273 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2"
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.583298 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2"
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.583313 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2"
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.628165 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2"
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.633193 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" podStartSLOduration=7.633162822 podStartE2EDuration="7.633162822s" podCreationTimestamp="2026-02-20 10:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:06:27.621288642 +0000 UTC m=+679.203760508" watchObservedRunningTime="2026-02-20 10:06:27.633162822 +0000 UTC m=+679.215634668"
Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.657365 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2"
Feb 20 10:06:28 crc kubenswrapper[4962]: I0220 10:06:28.103477 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"]
Feb 20 10:06:28 crc kubenswrapper[4962]: I0220 10:06:28.103649 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:28 crc kubenswrapper[4962]: I0220 10:06:28.104216 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135125 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135265 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135341 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135445 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9v9g5" podUID="6423ea5e-20ed-4977-a842-2bc521939341"
Feb 20 10:06:35 crc kubenswrapper[4962]: I0220 10:06:35.139189 4962 scope.go:117] "RemoveContainer" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef"
Feb 20 10:06:35 crc kubenswrapper[4962]: E0220 10:06:35.140175 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wqwgj_openshift-multus(1957ac70-30f9-48c2-a82b-72aa3b7a883a)\"" pod="openshift-multus/multus-wqwgj" podUID="1957ac70-30f9-48c2-a82b-72aa3b7a883a"
Feb 20 10:06:40 crc kubenswrapper[4962]: I0220 10:06:40.138859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:40 crc kubenswrapper[4962]: I0220 10:06:40.139349 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.168914 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.169255 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.169293 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.169359 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9v9g5" podUID="6423ea5e-20ed-4977-a842-2bc521939341"
Feb 20 10:06:41 crc kubenswrapper[4962]: I0220 10:06:41.508227 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:06:41 crc kubenswrapper[4962]: I0220 10:06:41.510050 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:06:48 crc kubenswrapper[4962]: I0220 10:06:48.139340 4962 scope.go:117] "RemoveContainer" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef"
Feb 20 10:06:48 crc kubenswrapper[4962]: I0220 10:06:48.728212 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/2.log"
Feb 20 10:06:48 crc kubenswrapper[4962]: I0220 10:06:48.728746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"6a05be812a6e15d39000d5ee5643496f369b2005317fcf7e7f04250bb5188bfc"}
Feb 20 10:06:50 crc kubenswrapper[4962]: I0220 10:06:50.692384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2"
Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.138497 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.139681 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.907144 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.907373 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"]
Feb 20 10:06:53 crc kubenswrapper[4962]: I0220 10:06:53.865362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9v9g5" event={"ID":"6423ea5e-20ed-4977-a842-2bc521939341","Type":"ContainerStarted","Data":"24267bde3cf143a1f44599c25d38b0e2ca1b3bcf78870f4f856537b7cd68bd3f"}
Feb 20 10:06:54 crc kubenswrapper[4962]: I0220 10:06:54.888486 4962 generic.go:334] "Generic (PLEG): container finished" podID="6423ea5e-20ed-4977-a842-2bc521939341" containerID="dd81866a8883595a9a43e5321d2a1e397058906782cb6839d22126aa9d907feb" exitCode=0
Feb 20 10:06:54 crc kubenswrapper[4962]: I0220 10:06:54.888577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9v9g5" event={"ID":"6423ea5e-20ed-4977-a842-2bc521939341","Type":"ContainerDied","Data":"dd81866a8883595a9a43e5321d2a1e397058906782cb6839d22126aa9d907feb"}
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.232993 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"6423ea5e-20ed-4977-a842-2bc521939341\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") "
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308375 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"6423ea5e-20ed-4977-a842-2bc521939341\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") "
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308412 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"6423ea5e-20ed-4977-a842-2bc521939341\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") "
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6423ea5e-20ed-4977-a842-2bc521939341" (UID: "6423ea5e-20ed-4977-a842-2bc521939341"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308705 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.313292 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4" (OuterVolumeSpecName: "kube-api-access-zszw4") pod "6423ea5e-20ed-4977-a842-2bc521939341" (UID: "6423ea5e-20ed-4977-a842-2bc521939341"). InnerVolumeSpecName "kube-api-access-zszw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.330724 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6423ea5e-20ed-4977-a842-2bc521939341" (UID: "6423ea5e-20ed-4977-a842-2bc521939341"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.410533 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") on node \"crc\" DevicePath \"\""
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.410582 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.908119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9v9g5" event={"ID":"6423ea5e-20ed-4977-a842-2bc521939341","Type":"ContainerDied","Data":"24267bde3cf143a1f44599c25d38b0e2ca1b3bcf78870f4f856537b7cd68bd3f"}
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.908202 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24267bde3cf143a1f44599c25d38b0e2ca1b3bcf78870f4f856537b7cd68bd3f"
Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.908246 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.095058 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"]
Feb 20 10:07:04 crc kubenswrapper[4962]: E0220 10:07:04.096572 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6423ea5e-20ed-4977-a842-2bc521939341" containerName="storage"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.096699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6423ea5e-20ed-4977-a842-2bc521939341" containerName="storage"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.097409 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6423ea5e-20ed-4977-a842-2bc521939341" containerName="storage"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.100692 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.104209 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.117357 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"]
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.231877 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.231989 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.232030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.333923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.334243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.334386 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.335253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.336043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.352815 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"
Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.452623 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.635085 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"] Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.974059 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerStarted","Data":"d91be3332b27241cee61bcd7398abd7cbbcb1f51878aeb387f65fde8ffd73790"} Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.974176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerStarted","Data":"72a64b6a19e16dda103631e55a0bb6921c7420e56b8354feeb8259198c1f1b3c"} Feb 20 10:07:05 crc kubenswrapper[4962]: I0220 10:07:05.986018 4962 generic.go:334] "Generic (PLEG): container finished" podID="650d9c53-94de-499d-8498-53afa3428c06" containerID="d91be3332b27241cee61bcd7398abd7cbbcb1f51878aeb387f65fde8ffd73790" exitCode=0 Feb 20 10:07:05 crc kubenswrapper[4962]: I0220 10:07:05.986166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"d91be3332b27241cee61bcd7398abd7cbbcb1f51878aeb387f65fde8ffd73790"} Feb 20 10:07:07 crc kubenswrapper[4962]: I0220 10:07:07.998949 4962 generic.go:334] "Generic (PLEG): container finished" podID="650d9c53-94de-499d-8498-53afa3428c06" containerID="4f992657730342a0f2eba9a2f8eaffbf3b181d1276791b34eb4ef99967e83016" exitCode=0 Feb 20 10:07:07 crc kubenswrapper[4962]: I0220 10:07:07.999002 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"4f992657730342a0f2eba9a2f8eaffbf3b181d1276791b34eb4ef99967e83016"} Feb 20 10:07:09 crc kubenswrapper[4962]: I0220 10:07:09.011901 4962 generic.go:334] "Generic (PLEG): container finished" podID="650d9c53-94de-499d-8498-53afa3428c06" containerID="acb768d317e1555893a5b7aedc9f487d7cb71f8dabc978f5cafa020c7b1863ba" exitCode=0 Feb 20 10:07:09 crc kubenswrapper[4962]: I0220 10:07:09.011982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"acb768d317e1555893a5b7aedc9f487d7cb71f8dabc978f5cafa020c7b1863ba"} Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.315084 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.445353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"650d9c53-94de-499d-8498-53afa3428c06\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.445439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"650d9c53-94de-499d-8498-53afa3428c06\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.445587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"650d9c53-94de-499d-8498-53afa3428c06\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.446121 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle" (OuterVolumeSpecName: "bundle") pod "650d9c53-94de-499d-8498-53afa3428c06" (UID: "650d9c53-94de-499d-8498-53afa3428c06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.450951 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s" (OuterVolumeSpecName: "kube-api-access-x496s") pod "650d9c53-94de-499d-8498-53afa3428c06" (UID: "650d9c53-94de-499d-8498-53afa3428c06"). InnerVolumeSpecName "kube-api-access-x496s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.548022 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.548072 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.553311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util" (OuterVolumeSpecName: "util") pod "650d9c53-94de-499d-8498-53afa3428c06" (UID: "650d9c53-94de-499d-8498-53afa3428c06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.649008 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.030784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"72a64b6a19e16dda103631e55a0bb6921c7420e56b8354feeb8259198c1f1b3c"} Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.030834 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72a64b6a19e16dda103631e55a0bb6921c7420e56b8354feeb8259198c1f1b3c" Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.030888 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.508425 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.508948 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162345 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-nkzm2"] Feb 20 10:07:13 crc kubenswrapper[4962]: E0220 10:07:13.162708 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="extract" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162726 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="extract" Feb 20 10:07:13 crc kubenswrapper[4962]: E0220 10:07:13.162745 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="util" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162754 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="util" Feb 20 10:07:13 crc kubenswrapper[4962]: E0220 10:07:13.162766 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="pull" Feb 20 
10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162774 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="pull" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162890 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="extract" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.163425 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.170404 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-46q2p" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.171154 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.175094 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.175289 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-nkzm2"] Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.313407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4jj\" (UniqueName: \"kubernetes.io/projected/cffc71cf-18b7-4733-b863-19b8664b5cf4-kube-api-access-xx4jj\") pod \"nmstate-operator-694c9596b7-nkzm2\" (UID: \"cffc71cf-18b7-4733-b863-19b8664b5cf4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.415041 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4jj\" (UniqueName: \"kubernetes.io/projected/cffc71cf-18b7-4733-b863-19b8664b5cf4-kube-api-access-xx4jj\") pod 
\"nmstate-operator-694c9596b7-nkzm2\" (UID: \"cffc71cf-18b7-4733-b863-19b8664b5cf4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.433885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4jj\" (UniqueName: \"kubernetes.io/projected/cffc71cf-18b7-4733-b863-19b8664b5cf4-kube-api-access-xx4jj\") pod \"nmstate-operator-694c9596b7-nkzm2\" (UID: \"cffc71cf-18b7-4733-b863-19b8664b5cf4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.483576 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.679565 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-nkzm2"] Feb 20 10:07:13 crc kubenswrapper[4962]: W0220 10:07:13.691727 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcffc71cf_18b7_4733_b863_19b8664b5cf4.slice/crio-c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443 WatchSource:0}: Error finding container c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443: Status 404 returned error can't find the container with id c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443 Feb 20 10:07:14 crc kubenswrapper[4962]: I0220 10:07:14.050860 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" event={"ID":"cffc71cf-18b7-4733-b863-19b8664b5cf4","Type":"ContainerStarted","Data":"c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443"} Feb 20 10:07:16 crc kubenswrapper[4962]: I0220 10:07:16.064916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" 
event={"ID":"cffc71cf-18b7-4733-b863-19b8664b5cf4","Type":"ContainerStarted","Data":"7d1a3616fffbfda6dbc09182574a8da7aa4ffec6021e3f24b359b239cfb0e195"} Feb 20 10:07:16 crc kubenswrapper[4962]: I0220 10:07:16.082716 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" podStartSLOduration=1.332512538 podStartE2EDuration="3.082694122s" podCreationTimestamp="2026-02-20 10:07:13 +0000 UTC" firstStartedPulling="2026-02-20 10:07:13.694986197 +0000 UTC m=+725.277458043" lastFinishedPulling="2026-02-20 10:07:15.445167781 +0000 UTC m=+727.027639627" observedRunningTime="2026-02-20 10:07:16.079194702 +0000 UTC m=+727.661666558" watchObservedRunningTime="2026-02-20 10:07:16.082694122 +0000 UTC m=+727.665165968" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.270504 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.272507 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.275956 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tp5zn" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.281788 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.288228 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.289147 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.296043 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-frtsf"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.297076 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.301700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.315263 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386285 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqwp\" (UniqueName: \"kubernetes.io/projected/edcc687e-09ef-4048-8db7-d67e6fe23212-kube-api-access-6nqwp\") pod \"nmstate-metrics-58c85c668d-6x8wh\" (UID: \"edcc687e-09ef-4048-8db7-d67e6fe23212\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-ovs-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-nmstate-lock\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " 
pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386445 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj9j\" (UniqueName: \"kubernetes.io/projected/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-kube-api-access-zkj9j\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-dbus-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386608 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7dj\" (UniqueName: \"kubernetes.io/projected/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-kube-api-access-rb7dj\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.409356 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.410118 4962 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.421135 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.421176 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.426930 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-27lrn" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.428023 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-nmstate-lock\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488332 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4cj\" (UniqueName: \"kubernetes.io/projected/e17e90c9-fe19-4544-9a79-bffc8072a763-kube-api-access-ll4cj\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkj9j\" (UniqueName: \"kubernetes.io/projected/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-kube-api-access-zkj9j\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " 
pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488383 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-dbus-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488446 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488516 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7dj\" (UniqueName: \"kubernetes.io/projected/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-kube-api-access-rb7dj\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqwp\" (UniqueName: \"kubernetes.io/projected/edcc687e-09ef-4048-8db7-d67e6fe23212-kube-api-access-6nqwp\") pod \"nmstate-metrics-58c85c668d-6x8wh\" 
(UID: \"edcc687e-09ef-4048-8db7-d67e6fe23212\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488659 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-ovs-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488679 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e17e90c9-fe19-4544-9a79-bffc8072a763-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488751 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-nmstate-lock\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.489292 4962 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.489353 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair podName:a453e12b-e95c-4c04-b67b-b5bc6527a3ab nodeName:}" failed. No retries permitted until 2026-02-20 10:07:24.989332613 +0000 UTC m=+736.571804459 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair") pod "nmstate-webhook-866bcb46dc-l2lqb" (UID: "a453e12b-e95c-4c04-b67b-b5bc6527a3ab") : secret "openshift-nmstate-webhook" not found Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.489516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-ovs-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.489297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-dbus-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.511262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqwp\" (UniqueName: \"kubernetes.io/projected/edcc687e-09ef-4048-8db7-d67e6fe23212-kube-api-access-6nqwp\") pod \"nmstate-metrics-58c85c668d-6x8wh\" (UID: \"edcc687e-09ef-4048-8db7-d67e6fe23212\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.524524 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkj9j\" (UniqueName: \"kubernetes.io/projected/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-kube-api-access-zkj9j\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.524620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7dj\" (UniqueName: 
\"kubernetes.io/projected/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-kube-api-access-rb7dj\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.590340 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e17e90c9-fe19-4544-9a79-bffc8072a763-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.590410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4cj\" (UniqueName: \"kubernetes.io/projected/e17e90c9-fe19-4544-9a79-bffc8072a763-kube-api-access-ll4cj\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.590452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.590659 4962 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.590723 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert podName:e17e90c9-fe19-4544-9a79-bffc8072a763 nodeName:}" failed. 
No retries permitted until 2026-02-20 10:07:25.090701561 +0000 UTC m=+736.673173397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-2hqd7" (UID: "e17e90c9-fe19-4544-9a79-bffc8072a763") : secret "plugin-serving-cert" not found Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.591961 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e17e90c9-fe19-4544-9a79-bffc8072a763-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.601627 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.608956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f9d58689-28fst"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.610409 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.619329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4cj\" (UniqueName: \"kubernetes.io/projected/e17e90c9-fe19-4544-9a79-bffc8072a763-kube-api-access-ll4cj\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.627089 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.689739 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f9d58689-28fst"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.691931 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.691983 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhq2\" (UniqueName: \"kubernetes.io/projected/0de52220-59c4-423b-80c3-b737466ac45f-kube-api-access-6qhq2\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-trusted-ca-bundle\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-service-ca\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692119 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-console-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-oauth-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-oauth-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.793884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.793955 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhq2\" (UniqueName: \"kubernetes.io/projected/0de52220-59c4-423b-80c3-b737466ac45f-kube-api-access-6qhq2\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 
10:07:24.793989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-trusted-ca-bundle\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-service-ca\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-console-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-oauth-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794458 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-oauth-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.795087 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-service-ca\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.795449 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-oauth-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.796549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-console-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.799755 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-trusted-ca-bundle\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.802467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-oauth-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.806244 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.812170 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhq2\" (UniqueName: \"kubernetes.io/projected/0de52220-59c4-423b-80c3-b737466ac45f-kube-api-access-6qhq2\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.850154 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.997719 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.000781 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.001523 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.099785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.103895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.121967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-frtsf" event={"ID":"5056ae4f-c2f7-41f5-8e12-b7b5d8996852","Type":"ContainerStarted","Data":"72b6617c0e471f826f5fc5f1b60506b802b42453bae6a1cefce486c81091337f"} Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.123120 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" event={"ID":"edcc687e-09ef-4048-8db7-d67e6fe23212","Type":"ContainerStarted","Data":"8b9b247a0dd6e5e8b90ee0af81501f91d9e8a8835eb8907bb9a741ad95002d55"} Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 
10:07:25.215859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.244450 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f9d58689-28fst"] Feb 20 10:07:25 crc kubenswrapper[4962]: W0220 10:07:25.253741 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de52220_59c4_423b_80c3_b737466ac45f.slice/crio-99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25 WatchSource:0}: Error finding container 99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25: Status 404 returned error can't find the container with id 99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25 Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.335881 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.480786 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb"] Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.575280 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7"] Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.130752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" event={"ID":"a453e12b-e95c-4c04-b67b-b5bc6527a3ab","Type":"ContainerStarted","Data":"f5aae3b0bdc26260dee4febc6406bb1315eff3d733d47005692e3dbe68f85c77"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.132883 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9d58689-28fst" 
event={"ID":"0de52220-59c4-423b-80c3-b737466ac45f","Type":"ContainerStarted","Data":"9670c5a06f5878d5fd68f1475e9173d76b30ab24f827a7ae446f35372fa9c420"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.132936 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9d58689-28fst" event={"ID":"0de52220-59c4-423b-80c3-b737466ac45f","Type":"ContainerStarted","Data":"99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.134449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" event={"ID":"e17e90c9-fe19-4544-9a79-bffc8072a763","Type":"ContainerStarted","Data":"cf7d427dabfe35c4069684117a2c755c5d41c3ce4825a1681c1d2fd34a2cbefa"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.153709 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f9d58689-28fst" podStartSLOduration=2.153686568 podStartE2EDuration="2.153686568s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:07:26.148773834 +0000 UTC m=+737.731245700" watchObservedRunningTime="2026-02-20 10:07:26.153686568 +0000 UTC m=+737.736158434" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.162349 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-frtsf" event={"ID":"5056ae4f-c2f7-41f5-8e12-b7b5d8996852","Type":"ContainerStarted","Data":"6d065815732fe6d25416667707bbe455c920b024a686ef500398a7c7231959af"} Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.164411 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.166414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" event={"ID":"edcc687e-09ef-4048-8db7-d67e6fe23212","Type":"ContainerStarted","Data":"5a2415cc832d88aebc7f468292ad0edc892c0d406540efcd8044035e762e0d50"} Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.169451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" event={"ID":"a453e12b-e95c-4c04-b67b-b5bc6527a3ab","Type":"ContainerStarted","Data":"f15f78fd82d3623376ada9062cfec324b7a9ff4daeacc55c90514a085bb07a45"} Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.171289 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.181576 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-frtsf" podStartSLOduration=1.8896062150000001 podStartE2EDuration="4.181542706s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:24.669395367 +0000 UTC m=+736.251867213" lastFinishedPulling="2026-02-20 10:07:26.961331858 +0000 UTC m=+738.543803704" observedRunningTime="2026-02-20 10:07:28.176436044 +0000 UTC m=+739.758907920" watchObservedRunningTime="2026-02-20 10:07:28.181542706 +0000 UTC m=+739.764014562" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.201169 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" podStartSLOduration=2.739946267 podStartE2EDuration="4.201137092s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:25.501063043 +0000 UTC m=+737.083534889" lastFinishedPulling="2026-02-20 10:07:26.962253858 +0000 UTC m=+738.544725714" observedRunningTime="2026-02-20 10:07:28.199311354 +0000 UTC m=+739.781783220" watchObservedRunningTime="2026-02-20 10:07:28.201137092 +0000 UTC m=+739.783608948" Feb 20 
10:07:29 crc kubenswrapper[4962]: I0220 10:07:29.177633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" event={"ID":"e17e90c9-fe19-4544-9a79-bffc8072a763","Type":"ContainerStarted","Data":"194a6a737cbb07a12489a7fc808522ffde9a3ac0211bd4734fbcb7f5624518e2"} Feb 20 10:07:29 crc kubenswrapper[4962]: I0220 10:07:29.197903 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" podStartSLOduration=2.046729685 podStartE2EDuration="5.19787573s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:25.586056016 +0000 UTC m=+737.168527862" lastFinishedPulling="2026-02-20 10:07:28.737202061 +0000 UTC m=+740.319673907" observedRunningTime="2026-02-20 10:07:29.191044494 +0000 UTC m=+740.773516350" watchObservedRunningTime="2026-02-20 10:07:29.19787573 +0000 UTC m=+740.780347576" Feb 20 10:07:31 crc kubenswrapper[4962]: I0220 10:07:31.193555 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" event={"ID":"edcc687e-09ef-4048-8db7-d67e6fe23212","Type":"ContainerStarted","Data":"693b17bbe55b417d4db6934ad6b457e3bf3190e3faf7fd1afcfa29de1bbd2439"} Feb 20 10:07:31 crc kubenswrapper[4962]: I0220 10:07:31.222352 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" podStartSLOduration=1.9733537559999998 podStartE2EDuration="7.222328729s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:24.858529625 +0000 UTC m=+736.441001471" lastFinishedPulling="2026-02-20 10:07:30.107504598 +0000 UTC m=+741.689976444" observedRunningTime="2026-02-20 10:07:31.219246163 +0000 UTC m=+742.801718049" watchObservedRunningTime="2026-02-20 10:07:31.222328729 +0000 UTC m=+742.804800595" Feb 20 10:07:34 crc kubenswrapper[4962]: I0220 10:07:34.670451 
4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.000963 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.001038 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.008555 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.232687 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.320113 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 10:07:40 crc kubenswrapper[4962]: I0220 10:07:40.229505 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.508006 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.508654 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:07:41 crc 
kubenswrapper[4962]: I0220 10:07:41.508774 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.510347 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.510552 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14" gracePeriod=600 Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.284806 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14" exitCode=0 Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.284872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"} Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.285341 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce"} Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 
10:07:42.285360 4962 scope.go:117] "RemoveContainer" containerID="1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838" Feb 20 10:07:45 crc kubenswrapper[4962]: I0220 10:07:45.225226 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.781048 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp"] Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.784126 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.786916 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.800981 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp"] Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.863953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.863990 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: 
\"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.864085 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965237 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965941 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.966068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.992836 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.104571 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.341347 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp"] Feb 20 10:08:00 crc kubenswrapper[4962]: W0220 10:08:00.347660 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15223064_e16f_4407_a15a_2105151aa73f.slice/crio-c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e WatchSource:0}: Error finding container c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e: Status 404 returned error can't find the container with id c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.372914 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nwfk6" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" containerID="cri-o://8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" gracePeriod=15 Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.416075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerStarted","Data":"c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e"} Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.723379 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nwfk6_09cfdba9-bfda-455d-b13e-58a6ea5a7d5a/console/0.log" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.723479 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777461 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777526 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777666 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778406 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca" (OuterVolumeSpecName: "service-ca") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778499 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778557 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config" (OuterVolumeSpecName: "console-config") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.782838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.782901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2" (OuterVolumeSpecName: "kube-api-access-h5fb2") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "kube-api-access-h5fb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.783080 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879831 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879870 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879884 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879896 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879910 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879923 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879932 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:01 crc 
kubenswrapper[4962]: I0220 10:08:01.107928 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:01 crc kubenswrapper[4962]: E0220 10:08:01.108351 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.108385 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.108585 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.110138 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.117650 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.183813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.184243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.184302 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.285414 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.285505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.285566 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.286348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.286442 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.312158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423457 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nwfk6_09cfdba9-bfda-455d-b13e-58a6ea5a7d5a/console/0.log" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423504 4962 generic.go:334] "Generic (PLEG): container finished" podID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" exitCode=2 Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423558 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerDied","Data":"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127"} Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerDied","Data":"7a175f5752b9da8dd07abe01e0077ca08911cfa3fb3fa2f627ad42bdc14904eb"} Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423645 4962 scope.go:117] "RemoveContainer" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423771 4962 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.427225 4962 generic.go:334] "Generic (PLEG): container finished" podID="15223064-e16f-4407-a15a-2105151aa73f" containerID="52e884a20e90cce670d676b67988c352a46ad24b3200f1d7087c910ee7e23935" exitCode=0 Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.427278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"52e884a20e90cce670d676b67988c352a46ad24b3200f1d7087c910ee7e23935"} Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.452184 4962 scope.go:117] "RemoveContainer" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" Feb 20 10:08:01 crc kubenswrapper[4962]: E0220 10:08:01.452675 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127\": container with ID starting with 8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127 not found: ID does not exist" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.452735 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127"} err="failed to get container status \"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127\": rpc error: code = NotFound desc = could not find container \"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127\": container with ID starting with 8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127 not found: ID does not exist" Feb 20 10:08:01 crc 
kubenswrapper[4962]: I0220 10:08:01.470473 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.478383 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.484093 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.584344 4962 patch_prober.go:28] interesting pod/console-f9d7485db-nwfk6 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.584433 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-nwfk6" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.698902 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:01 crc kubenswrapper[4962]: W0220 10:08:01.706908 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2998604a_1adc_4333_9c8a_a4128085b7ce.slice/crio-2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448 WatchSource:0}: Error finding container 2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448: Status 404 returned error can't find the container with id 
2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448 Feb 20 10:08:02 crc kubenswrapper[4962]: I0220 10:08:02.433831 4962 generic.go:334] "Generic (PLEG): container finished" podID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" exitCode=0 Feb 20 10:08:02 crc kubenswrapper[4962]: I0220 10:08:02.433908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301"} Feb 20 10:08:02 crc kubenswrapper[4962]: I0220 10:08:02.434305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerStarted","Data":"2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448"} Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.151091 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" path="/var/lib/kubelet/pods/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a/volumes" Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.443847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerStarted","Data":"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef"} Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.447045 4962 generic.go:334] "Generic (PLEG): container finished" podID="15223064-e16f-4407-a15a-2105151aa73f" containerID="153bb86b7a4f78ff9046ba5d361d4e56f087ef2bc141daa0bea58590d78beda6" exitCode=0 Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.447090 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" 
event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"153bb86b7a4f78ff9046ba5d361d4e56f087ef2bc141daa0bea58590d78beda6"} Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.459679 4962 generic.go:334] "Generic (PLEG): container finished" podID="15223064-e16f-4407-a15a-2105151aa73f" containerID="a0978366b2c5ace9202c0106b4e1591f07091e0ebd4532dff4e4a9227ba670b8" exitCode=0 Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.459786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"a0978366b2c5ace9202c0106b4e1591f07091e0ebd4532dff4e4a9227ba670b8"} Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.463566 4962 generic.go:334] "Generic (PLEG): container finished" podID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" exitCode=0 Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.463646 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef"} Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.475857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerStarted","Data":"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8"} Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.513012 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gq5bb" podStartSLOduration=2.107882801 podStartE2EDuration="4.512950522s" podCreationTimestamp="2026-02-20 10:08:01 +0000 UTC" firstStartedPulling="2026-02-20 
10:08:02.454288705 +0000 UTC m=+774.036760551" lastFinishedPulling="2026-02-20 10:08:04.859356426 +0000 UTC m=+776.441828272" observedRunningTime="2026-02-20 10:08:05.508968177 +0000 UTC m=+777.091440073" watchObservedRunningTime="2026-02-20 10:08:05.512950522 +0000 UTC m=+777.095422398" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.740776 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.764496 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"15223064-e16f-4407-a15a-2105151aa73f\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.764676 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"15223064-e16f-4407-a15a-2105151aa73f\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.764710 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"15223064-e16f-4407-a15a-2105151aa73f\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.765874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle" (OuterVolumeSpecName: "bundle") pod "15223064-e16f-4407-a15a-2105151aa73f" (UID: "15223064-e16f-4407-a15a-2105151aa73f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.766533 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.775954 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj" (OuterVolumeSpecName: "kube-api-access-qqvpj") pod "15223064-e16f-4407-a15a-2105151aa73f" (UID: "15223064-e16f-4407-a15a-2105151aa73f"). InnerVolumeSpecName "kube-api-access-qqvpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.778683 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util" (OuterVolumeSpecName: "util") pod "15223064-e16f-4407-a15a-2105151aa73f" (UID: "15223064-e16f-4407-a15a-2105151aa73f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.868492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.868543 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:06 crc kubenswrapper[4962]: I0220 10:08:06.487750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e"} Feb 20 10:08:06 crc kubenswrapper[4962]: I0220 10:08:06.487778 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:06 crc kubenswrapper[4962]: I0220 10:08:06.487809 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e" Feb 20 10:08:11 crc kubenswrapper[4962]: I0220 10:08:11.484521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:11 crc kubenswrapper[4962]: I0220 10:08:11.485699 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:12 crc kubenswrapper[4962]: I0220 10:08:12.545572 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gq5bb" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" probeResult="failure" output=< Feb 20 10:08:12 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:08:12 crc kubenswrapper[4962]: > Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.720796 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj"] Feb 20 10:08:17 crc kubenswrapper[4962]: E0220 10:08:17.721311 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="extract" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721326 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="extract" Feb 20 10:08:17 crc kubenswrapper[4962]: E0220 10:08:17.721344 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="util" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721352 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="util" Feb 20 10:08:17 crc kubenswrapper[4962]: E0220 10:08:17.721362 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="pull" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721369 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="pull" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721459 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="extract" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.738739 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.739043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.739205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-cswgf" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.747119 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.747740 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.754748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-webhook-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.754873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9l7\" (UniqueName: \"kubernetes.io/projected/403ba47d-bbe1-48f6-9382-47f12bbb75ae-kube-api-access-ph9l7\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.755023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-apiservice-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.763913 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj"] Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.856498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-apiservice-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.856617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-webhook-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.856641 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9l7\" (UniqueName: \"kubernetes.io/projected/403ba47d-bbe1-48f6-9382-47f12bbb75ae-kube-api-access-ph9l7\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.869565 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-apiservice-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.869566 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-webhook-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.874456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9l7\" (UniqueName: \"kubernetes.io/projected/403ba47d-bbe1-48f6-9382-47f12bbb75ae-kube-api-access-ph9l7\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " 
pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.054997 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.061064 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd"] Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.061961 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.064181 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.064705 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.064752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tjmf9" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.133222 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd"] Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.160548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-webhook-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.160630 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6m5\" (UniqueName: \"kubernetes.io/projected/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-kube-api-access-hr6m5\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.160666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-apiservice-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.261437 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-apiservice-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.262402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-webhook-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.262433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6m5\" (UniqueName: \"kubernetes.io/projected/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-kube-api-access-hr6m5\") pod 
\"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.267339 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-apiservice-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.283734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6m5\" (UniqueName: \"kubernetes.io/projected/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-kube-api-access-hr6m5\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.284392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-webhook-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.425135 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.513581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj"] Feb 20 10:08:18 crc kubenswrapper[4962]: W0220 10:08:18.535741 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403ba47d_bbe1_48f6_9382_47f12bbb75ae.slice/crio-15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03 WatchSource:0}: Error finding container 15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03: Status 404 returned error can't find the container with id 15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03 Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.579916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" event={"ID":"403ba47d-bbe1-48f6-9382-47f12bbb75ae","Type":"ContainerStarted","Data":"15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03"} Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.903919 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd"] Feb 20 10:08:18 crc kubenswrapper[4962]: W0220 10:08:18.914323 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae49f4e_271b_40e8_9cfc_9857fc2de6f3.slice/crio-85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1 WatchSource:0}: Error finding container 85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1: Status 404 returned error can't find the container with id 85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1 Feb 20 10:08:19 crc kubenswrapper[4962]: I0220 10:08:19.595229 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" event={"ID":"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3","Type":"ContainerStarted","Data":"85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1"} Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.527162 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.588523 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.610064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" event={"ID":"403ba47d-bbe1-48f6-9382-47f12bbb75ae","Type":"ContainerStarted","Data":"ca63e54f1e75d95b67ff13a5e5f2b314fd34f194df66e73c37fcb9e2815a3ea4"} Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.635034 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" podStartSLOduration=1.8339391539999998 podStartE2EDuration="4.635008708s" podCreationTimestamp="2026-02-20 10:08:17 +0000 UTC" firstStartedPulling="2026-02-20 10:08:18.541645141 +0000 UTC m=+790.124116977" lastFinishedPulling="2026-02-20 10:08:21.342714685 +0000 UTC m=+792.925186531" observedRunningTime="2026-02-20 10:08:21.630750369 +0000 UTC m=+793.213222225" watchObservedRunningTime="2026-02-20 10:08:21.635008708 +0000 UTC m=+793.217480554" Feb 20 10:08:22 crc kubenswrapper[4962]: I0220 10:08:22.618051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.628088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" event={"ID":"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3","Type":"ContainerStarted","Data":"82f3a2ebf6a19c278b24356ba6079d4de9838d6b6b1315a85e395071abeb8d5c"} Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.653042 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" podStartSLOduration=1.568976641 podStartE2EDuration="5.653020397s" podCreationTimestamp="2026-02-20 10:08:18 +0000 UTC" firstStartedPulling="2026-02-20 10:08:18.918308782 +0000 UTC m=+790.500780638" lastFinishedPulling="2026-02-20 10:08:23.002352528 +0000 UTC m=+794.584824394" observedRunningTime="2026-02-20 10:08:23.649775119 +0000 UTC m=+795.232246975" watchObservedRunningTime="2026-02-20 10:08:23.653020397 +0000 UTC m=+795.235492253" Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.892504 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.892748 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gq5bb" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" containerID="cri-o://8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" gracePeriod=2 Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.338980 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.379791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"2998604a-1adc-4333-9c8a-a4128085b7ce\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.379982 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"2998604a-1adc-4333-9c8a-a4128085b7ce\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.380040 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"2998604a-1adc-4333-9c8a-a4128085b7ce\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.382136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities" (OuterVolumeSpecName: "utilities") pod "2998604a-1adc-4333-9c8a-a4128085b7ce" (UID: "2998604a-1adc-4333-9c8a-a4128085b7ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.386763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl" (OuterVolumeSpecName: "kube-api-access-r8ttl") pod "2998604a-1adc-4333-9c8a-a4128085b7ce" (UID: "2998604a-1adc-4333-9c8a-a4128085b7ce"). InnerVolumeSpecName "kube-api-access-r8ttl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.483984 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.484229 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.525537 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2998604a-1adc-4333-9c8a-a4128085b7ce" (UID: "2998604a-1adc-4333-9c8a-a4128085b7ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.585929 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637864 4962 generic.go:334] "Generic (PLEG): container finished" podID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" exitCode=0 Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637933 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637948 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8"} Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637999 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448"} Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.638043 4962 scope.go:117] "RemoveContainer" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.638669 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.681799 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.688485 4962 scope.go:117] "RemoveContainer" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.700423 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.714715 4962 scope.go:117] "RemoveContainer" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.742400 4962 scope.go:117] "RemoveContainer" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" Feb 20 10:08:24 crc 
kubenswrapper[4962]: E0220 10:08:24.743053 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8\": container with ID starting with 8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8 not found: ID does not exist" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743100 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8"} err="failed to get container status \"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8\": rpc error: code = NotFound desc = could not find container \"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8\": container with ID starting with 8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8 not found: ID does not exist" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743135 4962 scope.go:117] "RemoveContainer" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" Feb 20 10:08:24 crc kubenswrapper[4962]: E0220 10:08:24.743468 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef\": container with ID starting with daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef not found: ID does not exist" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743490 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef"} err="failed to get container status 
\"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef\": rpc error: code = NotFound desc = could not find container \"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef\": container with ID starting with daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef not found: ID does not exist" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743502 4962 scope.go:117] "RemoveContainer" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" Feb 20 10:08:24 crc kubenswrapper[4962]: E0220 10:08:24.743752 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301\": container with ID starting with c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301 not found: ID does not exist" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743774 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301"} err="failed to get container status \"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301\": rpc error: code = NotFound desc = could not find container \"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301\": container with ID starting with c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301 not found: ID does not exist" Feb 20 10:08:25 crc kubenswrapper[4962]: I0220 10:08:25.150414 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" path="/var/lib/kubelet/pods/2998604a-1adc-4333-9c8a-a4128085b7ce/volumes" Feb 20 10:08:38 crc kubenswrapper[4962]: I0220 10:08:38.457712 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.059106 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827019 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zf82t"] Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.827782 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-utilities" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827815 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-utilities" Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.827837 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-content" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-content" Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.827865 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827876 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.828066 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.831082 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.835074 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mrvb4" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.835247 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.837200 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.848398 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.849569 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.852738 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.864342 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890032 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics-certs\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-reloader\") pod \"frr-k8s-zf82t\" (UID: 
\"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zlq\" (UniqueName: \"kubernetes.io/projected/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-kube-api-access-66zlq\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzt9w\" (UniqueName: \"kubernetes.io/projected/7135845d-f595-42df-9773-7701c9a0b2e2-kube-api-access-xzt9w\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890216 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890234 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-startup\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890277 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-sockets\") pod 
\"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-conf\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890490 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.924233 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rx2lw"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.925391 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.930482 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.930700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g9t8n" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.931004 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.932006 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.942382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-29wdn"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.948646 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.954424 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-startup\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metallb-excludel2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-sockets\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-conf\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metrics-certs\") pod \"speaker-rx2lw\" 
(UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7q4\" (UniqueName: \"kubernetes.io/projected/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-kube-api-access-7h7q4\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993000 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993029 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-cert\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " 
pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics-certs\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993085 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-reloader\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zlq\" (UniqueName: \"kubernetes.io/projected/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-kube-api-access-66zlq\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzt9w\" (UniqueName: \"kubernetes.io/projected/7135845d-f595-42df-9773-7701c9a0b2e2-kube-api-access-xzt9w\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhx2\" (UniqueName: \"kubernetes.io/projected/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-kube-api-access-tmhx2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc 
kubenswrapper[4962]: I0220 10:08:58.993174 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.993289 4962 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.993347 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert podName:7135845d-f595-42df-9773-7701c9a0b2e2 nodeName:}" failed. No retries permitted until 2026-02-20 10:08:59.493329323 +0000 UTC m=+831.075801169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert") pod "frr-k8s-webhook-server-78b44bf5bb-hb87m" (UID: "7135845d-f595-42df-9773-7701c9a0b2e2") : secret "frr-k8s-webhook-server-cert" not found Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993708 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-sockets\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-reloader\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993960 4962 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-29wdn"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.994005 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-conf\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.994329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-startup\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.996337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.009090 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics-certs\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.009380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzt9w\" (UniqueName: \"kubernetes.io/projected/7135845d-f595-42df-9773-7701c9a0b2e2-kube-api-access-xzt9w\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.011363 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-66zlq\" (UniqueName: \"kubernetes.io/projected/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-kube-api-access-66zlq\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094442 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhx2\" (UniqueName: \"kubernetes.io/projected/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-kube-api-access-tmhx2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metallb-excludel2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metrics-certs\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7q4\" (UniqueName: \"kubernetes.io/projected/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-kube-api-access-7h7q4\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094701 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-cert\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.094820 4962 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.094922 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs podName:7af7ee52-8865-48ce-85e5-7b62fb0d67d3 nodeName:}" failed. No retries permitted until 2026-02-20 10:08:59.594892813 +0000 UTC m=+831.177364669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs") pod "controller-69bbfbf88f-29wdn" (UID: "7af7ee52-8865-48ce-85e5-7b62fb0d67d3") : secret "controller-certs-secret" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.094962 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.095112 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist podName:c8b5efc7-c8c4-4492-a8a9-31eaecfa8374 nodeName:}" failed. No retries permitted until 2026-02-20 10:08:59.595080808 +0000 UTC m=+831.177552764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist") pod "speaker-rx2lw" (UID: "c8b5efc7-c8c4-4492-a8a9-31eaecfa8374") : secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.095279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metallb-excludel2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.101117 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-cert\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.109981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metrics-certs\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.114174 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhx2\" (UniqueName: \"kubernetes.io/projected/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-kube-api-access-tmhx2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.117323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7q4\" (UniqueName: \"kubernetes.io/projected/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-kube-api-access-7h7q4\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.157534 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.502935 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.510997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.606806 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.606884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.607054 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.607164 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist podName:c8b5efc7-c8c4-4492-a8a9-31eaecfa8374 
nodeName:}" failed. No retries permitted until 2026-02-20 10:09:00.607136265 +0000 UTC m=+832.189608311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist") pod "speaker-rx2lw" (UID: "c8b5efc7-c8c4-4492-a8a9-31eaecfa8374") : secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.612310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.765535 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.857634 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"88001c889bfe063aa4b9580aa7618db487b62b3d5d5c8321559e84b61c43f59a"} Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.864296 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.211042 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m"] Feb 20 10:09:00 crc kubenswrapper[4962]: W0220 10:09:00.211899 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7135845d_f595_42df_9773_7701c9a0b2e2.slice/crio-8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754 WatchSource:0}: Error finding container 8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754: Status 404 returned error can't find the container with id 8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754 Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.322486 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-29wdn"] Feb 20 10:09:00 crc kubenswrapper[4962]: W0220 10:09:00.332758 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af7ee52_8865_48ce_85e5_7b62fb0d67d3.slice/crio-7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39 WatchSource:0}: Error finding container 7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39: Status 404 returned error can't find the container with id 7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39 Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.619831 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.625675 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.743361 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rx2lw" Feb 20 10:09:00 crc kubenswrapper[4962]: W0220 10:09:00.769444 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b5efc7_c8c4_4492_a8a9_31eaecfa8374.slice/crio-53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae WatchSource:0}: Error finding container 53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae: Status 404 returned error can't find the container with id 53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866849 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-29wdn" event={"ID":"7af7ee52-8865-48ce-85e5-7b62fb0d67d3","Type":"ContainerStarted","Data":"573220f51e873ebd9a38f2bb1b436efcadf5368d005fc13f5d3fc5b28e0c1024"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866919 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-29wdn" event={"ID":"7af7ee52-8865-48ce-85e5-7b62fb0d67d3","Type":"ContainerStarted","Data":"0e4f4f5b504f209130a89ed33701043c3e959c6d3f5bd517720c6f71e47d8e68"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866937 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-29wdn" event={"ID":"7af7ee52-8865-48ce-85e5-7b62fb0d67d3","Type":"ContainerStarted","Data":"7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866990 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.869461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rx2lw" event={"ID":"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374","Type":"ContainerStarted","Data":"53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.871260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" event={"ID":"7135845d-f595-42df-9773-7701c9a0b2e2","Type":"ContainerStarted","Data":"8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.888760 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-29wdn" podStartSLOduration=2.888732746 podStartE2EDuration="2.888732746s" podCreationTimestamp="2026-02-20 10:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:09:00.881938889 +0000 UTC m=+832.464410755" watchObservedRunningTime="2026-02-20 10:09:00.888732746 +0000 UTC m=+832.471204592" Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.903669 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rx2lw" event={"ID":"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374","Type":"ContainerStarted","Data":"37cb2dbb2252525197eeb148128697f9a359ceef5ca33e7792293725986d53b3"} Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.903940 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rx2lw" event={"ID":"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374","Type":"ContainerStarted","Data":"d28244abb1b7d99b98d17d0d2301f797137939e540cead347e8472899ceb4720"} Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.904846 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/speaker-rx2lw" Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.936056 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rx2lw" podStartSLOduration=3.936038302 podStartE2EDuration="3.936038302s" podCreationTimestamp="2026-02-20 10:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:09:01.935544157 +0000 UTC m=+833.518016003" watchObservedRunningTime="2026-02-20 10:09:01.936038302 +0000 UTC m=+833.518510148" Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.961031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" event={"ID":"7135845d-f595-42df-9773-7701c9a0b2e2","Type":"ContainerStarted","Data":"81cd0f3e683d45f7fa282b87fd7e0f42002369ac374a976748add17cc626018f"} Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.961763 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.963618 4962 generic.go:334] "Generic (PLEG): container finished" podID="3eb8e16a-ffc3-4756-a3ee-96473eecf85d" containerID="6cea3ca8b602ce4569b14e9517c3e36ed0bc21260e7a380fb219062b228d7f8c" exitCode=0 Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.963682 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerDied","Data":"6cea3ca8b602ce4569b14e9517c3e36ed0bc21260e7a380fb219062b228d7f8c"} Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.989432 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" podStartSLOduration=2.730846729 podStartE2EDuration="8.98941483s" podCreationTimestamp="2026-02-20 
10:08:58 +0000 UTC" firstStartedPulling="2026-02-20 10:09:00.214386559 +0000 UTC m=+831.796858405" lastFinishedPulling="2026-02-20 10:09:06.47295466 +0000 UTC m=+838.055426506" observedRunningTime="2026-02-20 10:09:06.984973375 +0000 UTC m=+838.567445221" watchObservedRunningTime="2026-02-20 10:09:06.98941483 +0000 UTC m=+838.571886666" Feb 20 10:09:07 crc kubenswrapper[4962]: I0220 10:09:07.972895 4962 generic.go:334] "Generic (PLEG): container finished" podID="3eb8e16a-ffc3-4756-a3ee-96473eecf85d" containerID="886c4bacfd41c907d3736bcbd045ec59d695c6b426c64992c62e0b0c83dc625b" exitCode=0 Feb 20 10:09:07 crc kubenswrapper[4962]: I0220 10:09:07.972985 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerDied","Data":"886c4bacfd41c907d3736bcbd045ec59d695c6b426c64992c62e0b0c83dc625b"} Feb 20 10:09:08 crc kubenswrapper[4962]: I0220 10:09:08.988992 4962 generic.go:334] "Generic (PLEG): container finished" podID="3eb8e16a-ffc3-4756-a3ee-96473eecf85d" containerID="8d103a0e98d7a3ece8599bebac16aff98d0de848875efe9471b4c419d46dfeaa" exitCode=0 Feb 20 10:09:08 crc kubenswrapper[4962]: I0220 10:09:08.989068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerDied","Data":"8d103a0e98d7a3ece8599bebac16aff98d0de848875efe9471b4c419d46dfeaa"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"40616a045725a9afa05b8488394bad8c6195fdf8f1f6ee164c766cd56717cee4"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" 
event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"c48504fd838e904c7b1daad5c78619d61ec2136d7312c27042b01a98bef0fb8c"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"2f9b7b3f82f874d5e373b8865caf55f62599fe461f368ef20a048b541beab728"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"d9b5b9275fc2bd36861964938112a3904a31bd2b563c1c792949a36580d805d0"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010713 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"e0fa5df594f5b3065d16fca7a01ffee712861bd476ce54376311ec19c8621228"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.748423 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rx2lw" Feb 20 10:09:11 crc kubenswrapper[4962]: I0220 10:09:11.024764 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"49ae5aadd012b574e8b8bdfd11b85cf59cc7613ee4962ea13e534d14fe466c03"} Feb 20 10:09:11 crc kubenswrapper[4962]: I0220 10:09:11.025090 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:11 crc kubenswrapper[4962]: I0220 10:09:11.063808 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zf82t" podStartSLOduration=5.884607316 podStartE2EDuration="13.063776032s" podCreationTimestamp="2026-02-20 10:08:58 +0000 
UTC" firstStartedPulling="2026-02-20 10:08:59.285343058 +0000 UTC m=+830.867814904" lastFinishedPulling="2026-02-20 10:09:06.464511774 +0000 UTC m=+838.046983620" observedRunningTime="2026-02-20 10:09:11.060215205 +0000 UTC m=+842.642687091" watchObservedRunningTime="2026-02-20 10:09:11.063776032 +0000 UTC m=+842.646247908" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.622614 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z"] Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.623837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.625951 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.673541 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z"] Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.718224 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.718306 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: 
\"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.718395 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820824 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.847702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.938473 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:13 crc kubenswrapper[4962]: I0220 10:09:13.148954 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z"] Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.050980 4962 generic.go:334] "Generic (PLEG): container finished" podID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerID="4aeee85e2c14d3a7fe2f8c49af3f661e1408188a571d437d5da3e2e875197af8" exitCode=0 Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.051125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"4aeee85e2c14d3a7fe2f8c49af3f661e1408188a571d437d5da3e2e875197af8"} Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.052100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerStarted","Data":"a83ddc1d1dbabac527837db39de85778c0930ece92079843f18edda6431e7e29"} Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.158222 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.216376 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.090489 4962 generic.go:334] "Generic (PLEG): container finished" podID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerID="6b15885b4927d9d19a457c546d9edc1cfe227be824780684ea21ca71fd55db66" exitCode=0 Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.090705 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"6b15885b4927d9d19a457c546d9edc1cfe227be824780684ea21ca71fd55db66"} Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.165312 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.774663 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.870923 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:09:20 crc kubenswrapper[4962]: I0220 10:09:20.105668 4962 generic.go:334] "Generic (PLEG): container finished" podID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerID="1d9fa17133e4014eadec9971d9d94641bdabf9b142de77f71866cc7b4c033342" exitCode=0 Feb 20 10:09:20 crc kubenswrapper[4962]: I0220 10:09:20.105787 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"1d9fa17133e4014eadec9971d9d94641bdabf9b142de77f71866cc7b4c033342"} Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.442761 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.564941 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"9536c987-ff07-45d5-b8c8-12cfe3019427\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.565154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"9536c987-ff07-45d5-b8c8-12cfe3019427\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.565184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"9536c987-ff07-45d5-b8c8-12cfe3019427\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.566233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle" (OuterVolumeSpecName: "bundle") pod "9536c987-ff07-45d5-b8c8-12cfe3019427" (UID: "9536c987-ff07-45d5-b8c8-12cfe3019427"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.573757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9" (OuterVolumeSpecName: "kube-api-access-l5vl9") pod "9536c987-ff07-45d5-b8c8-12cfe3019427" (UID: "9536c987-ff07-45d5-b8c8-12cfe3019427"). InnerVolumeSpecName "kube-api-access-l5vl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.586776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util" (OuterVolumeSpecName: "util") pod "9536c987-ff07-45d5-b8c8-12cfe3019427" (UID: "9536c987-ff07-45d5-b8c8-12cfe3019427"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.668266 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.668638 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.668649 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:22 crc kubenswrapper[4962]: I0220 10:09:22.127284 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"a83ddc1d1dbabac527837db39de85778c0930ece92079843f18edda6431e7e29"} Feb 20 10:09:22 crc kubenswrapper[4962]: I0220 10:09:22.127356 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83ddc1d1dbabac527837db39de85778c0930ece92079843f18edda6431e7e29" Feb 20 10:09:22 crc kubenswrapper[4962]: I0220 10:09:22.127410 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780131 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w"] Feb 20 10:09:26 crc kubenswrapper[4962]: E0220 10:09:26.780827 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="extract" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780847 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="extract" Feb 20 10:09:26 crc kubenswrapper[4962]: E0220 10:09:26.780879 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="pull" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780887 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="pull" Feb 20 10:09:26 crc kubenswrapper[4962]: E0220 10:09:26.780899 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="util" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780906 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="util" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.781027 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="extract" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.781555 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.784923 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.786719 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-fsqpv" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.789252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.845126 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w"] Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.947673 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxmk\" (UniqueName: \"kubernetes.io/projected/74abbb4a-2e6c-459a-8646-28b2519ca98a-kube-api-access-vsxmk\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.947742 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74abbb4a-2e6c-459a-8646-28b2519ca98a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.049888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/74abbb4a-2e6c-459a-8646-28b2519ca98a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.050193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxmk\" (UniqueName: \"kubernetes.io/projected/74abbb4a-2e6c-459a-8646-28b2519ca98a-kube-api-access-vsxmk\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.050714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74abbb4a-2e6c-459a-8646-28b2519ca98a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.077511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxmk\" (UniqueName: \"kubernetes.io/projected/74abbb4a-2e6c-459a-8646-28b2519ca98a-kube-api-access-vsxmk\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.097888 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.540520 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w"] Feb 20 10:09:27 crc kubenswrapper[4962]: W0220 10:09:27.551509 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74abbb4a_2e6c_459a_8646_28b2519ca98a.slice/crio-e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e WatchSource:0}: Error finding container e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e: Status 404 returned error can't find the container with id e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e Feb 20 10:09:28 crc kubenswrapper[4962]: I0220 10:09:28.189575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" event={"ID":"74abbb4a-2e6c-459a-8646-28b2519ca98a","Type":"ContainerStarted","Data":"e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e"} Feb 20 10:09:32 crc kubenswrapper[4962]: I0220 10:09:32.226389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" event={"ID":"74abbb4a-2e6c-459a-8646-28b2519ca98a","Type":"ContainerStarted","Data":"c6efaed722eff0e6ecc55d72bb48f6b38d1ec5846343e7e3f09fc5b3b3d35a5b"} Feb 20 10:09:32 crc kubenswrapper[4962]: I0220 10:09:32.253047 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" podStartSLOduration=2.715469846 podStartE2EDuration="6.253027125s" podCreationTimestamp="2026-02-20 10:09:26 +0000 UTC" firstStartedPulling="2026-02-20 10:09:27.55482445 +0000 UTC m=+859.137296296" 
lastFinishedPulling="2026-02-20 10:09:31.092381719 +0000 UTC m=+862.674853575" observedRunningTime="2026-02-20 10:09:32.249311804 +0000 UTC m=+863.831783650" watchObservedRunningTime="2026-02-20 10:09:32.253027125 +0000 UTC m=+863.835498971" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.072227 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t5nv4"] Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.073439 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.075935 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.077937 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-n7fx8" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.083377 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.083963 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t5nv4"] Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.181534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqf7j\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-kube-api-access-qqf7j\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.181657 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.283256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqf7j\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-kube-api-access-qqf7j\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.283401 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.301258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.302192 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqf7j\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-kube-api-access-qqf7j\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.389624 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.813971 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t5nv4"] Feb 20 10:09:36 crc kubenswrapper[4962]: I0220 10:09:36.251138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" event={"ID":"0d86f751-d081-47b7-a623-a9cc14ab43f7","Type":"ContainerStarted","Data":"87c431b9e2a9770b37d6b34a97b11944b5042bd6cd73b84af91fd09b7ca9b405"} Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.357675 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lvkdh"] Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.358917 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.361900 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wnw7f" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.373153 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lvkdh"] Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.517384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8cd\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-kube-api-access-wg8cd\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.517438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.619194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8cd\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-kube-api-access-wg8cd\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.619251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.685737 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8cd\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-kube-api-access-wg8cd\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.686531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.986314 4962 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:38 crc kubenswrapper[4962]: I0220 10:09:38.238111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lvkdh"] Feb 20 10:09:38 crc kubenswrapper[4962]: I0220 10:09:38.268262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" event={"ID":"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3","Type":"ContainerStarted","Data":"2b17b3521afe03e996822dd9946a92c6704e68baedc01dcb608100b67f0b1aa1"} Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.288906 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" event={"ID":"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3","Type":"ContainerStarted","Data":"06a45cebf86a3e6873275b4afae9441485ff21c7e7aab0713828afea39bb6a78"} Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.291522 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" event={"ID":"0d86f751-d081-47b7-a623-a9cc14ab43f7","Type":"ContainerStarted","Data":"126d53205b09856ec052f966183bef2386f5de5d7433b4c517ad8d0e9e1008b2"} Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.292083 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.314956 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" podStartSLOduration=1.792623007 podStartE2EDuration="4.31493072s" podCreationTimestamp="2026-02-20 10:09:37 +0000 UTC" firstStartedPulling="2026-02-20 10:09:38.249585345 +0000 UTC m=+869.832057191" lastFinishedPulling="2026-02-20 10:09:40.771893068 +0000 UTC m=+872.354364904" observedRunningTime="2026-02-20 10:09:41.308179458 +0000 UTC 
m=+872.890651314" watchObservedRunningTime="2026-02-20 10:09:41.31493072 +0000 UTC m=+872.897402586" Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.343670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" podStartSLOduration=1.389267471 podStartE2EDuration="6.343645049s" podCreationTimestamp="2026-02-20 10:09:35 +0000 UTC" firstStartedPulling="2026-02-20 10:09:35.823854189 +0000 UTC m=+867.406326035" lastFinishedPulling="2026-02-20 10:09:40.778231767 +0000 UTC m=+872.360703613" observedRunningTime="2026-02-20 10:09:41.343310519 +0000 UTC m=+872.925782395" watchObservedRunningTime="2026-02-20 10:09:41.343645049 +0000 UTC m=+872.926116905" Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.507816 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.507891 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.207227 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-ctc7p"] Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.209248 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.211960 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bc9pb" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.222708 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ctc7p"] Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.341816 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhbr8\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-kube-api-access-jhbr8\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.341874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-bound-sa-token\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.395289 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.443051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhbr8\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-kube-api-access-jhbr8\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.443155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-bound-sa-token\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.469283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-bound-sa-token\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.482740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhbr8\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-kube-api-access-jhbr8\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.526156 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.976938 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ctc7p"] Feb 20 10:09:46 crc kubenswrapper[4962]: I0220 10:09:46.328778 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ctc7p" event={"ID":"41c6ef1c-4069-44b1-a0ba-de5e820a630c","Type":"ContainerStarted","Data":"3cfa1d1bb027eaebb8a5d6e4408907632e040da623c8f1bf1860cc2ee622c7e2"} Feb 20 10:09:46 crc kubenswrapper[4962]: I0220 10:09:46.329312 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ctc7p" event={"ID":"41c6ef1c-4069-44b1-a0ba-de5e820a630c","Type":"ContainerStarted","Data":"179085b91e733d55c7c4f421cf080d8c3f1001811031e19a46d6187d0fb17bda"} Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.386773 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-ctc7p" podStartSLOduration=4.386743671 podStartE2EDuration="4.386743671s" podCreationTimestamp="2026-02-20 10:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:09:46.368982589 +0000 UTC m=+877.951454515" watchObservedRunningTime="2026-02-20 10:09:49.386743671 +0000 UTC m=+880.969215527" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.392360 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.393422 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.397802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.397821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-shvls" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.398076 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.409408 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.514694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"openstack-operator-index-dtbt8\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.617171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"openstack-operator-index-dtbt8\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.639575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"openstack-operator-index-dtbt8\" (UID: 
\"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.758463 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:50 crc kubenswrapper[4962]: I0220 10:09:50.000310 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:50 crc kubenswrapper[4962]: I0220 10:09:50.361917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerStarted","Data":"3852b8283668d47a9524139d85fe1441b979672d19315ed7bbee812f03c2018e"} Feb 20 10:09:52 crc kubenswrapper[4962]: I0220 10:09:52.382526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerStarted","Data":"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae"} Feb 20 10:09:52 crc kubenswrapper[4962]: I0220 10:09:52.412403 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dtbt8" podStartSLOduration=2.537836082 podStartE2EDuration="3.41237337s" podCreationTimestamp="2026-02-20 10:09:49 +0000 UTC" firstStartedPulling="2026-02-20 10:09:50.014850519 +0000 UTC m=+881.597322365" lastFinishedPulling="2026-02-20 10:09:50.889387807 +0000 UTC m=+882.471859653" observedRunningTime="2026-02-20 10:09:52.405342259 +0000 UTC m=+883.987814135" watchObservedRunningTime="2026-02-20 10:09:52.41237337 +0000 UTC m=+883.994845256" Feb 20 10:09:53 crc kubenswrapper[4962]: I0220 10:09:53.949790 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.408024 
4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dtbt8" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" containerID="cri-o://55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" gracePeriod=2 Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.788507 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t9zxk"] Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.790487 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9zxk"] Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.790645 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.909223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tkk4\" (UniqueName: \"kubernetes.io/projected/46f437ac-c97a-4af9-92e7-6bec63b7d8d8-kube-api-access-8tkk4\") pod \"openstack-operator-index-t9zxk\" (UID: \"46f437ac-c97a-4af9-92e7-6bec63b7d8d8\") " pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.911283 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.011200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.011794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tkk4\" (UniqueName: \"kubernetes.io/projected/46f437ac-c97a-4af9-92e7-6bec63b7d8d8-kube-api-access-8tkk4\") pod \"openstack-operator-index-t9zxk\" (UID: \"46f437ac-c97a-4af9-92e7-6bec63b7d8d8\") " pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.023710 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts" (OuterVolumeSpecName: "kube-api-access-nn9ts") pod "c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" (UID: "c3bc0b8f-a3b2-4549-aa20-dc609d7965fd"). InnerVolumeSpecName "kube-api-access-nn9ts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.036124 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tkk4\" (UniqueName: \"kubernetes.io/projected/46f437ac-c97a-4af9-92e7-6bec63b7d8d8-kube-api-access-8tkk4\") pod \"openstack-operator-index-t9zxk\" (UID: \"46f437ac-c97a-4af9-92e7-6bec63b7d8d8\") " pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.113616 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.123993 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.374883 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9zxk"] Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.423642 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9zxk" event={"ID":"46f437ac-c97a-4af9-92e7-6bec63b7d8d8","Type":"ContainerStarted","Data":"818c2adbc401eff459d353ae2c94eb60c1b43885ba912f1729e7826b8c860b79"} Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428390 4962 generic.go:334] "Generic (PLEG): container finished" podID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" exitCode=0 Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" 
event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerDied","Data":"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae"} Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428463 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerDied","Data":"3852b8283668d47a9524139d85fe1441b979672d19315ed7bbee812f03c2018e"} Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428488 4962 scope.go:117] "RemoveContainer" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428521 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.456138 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.462633 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.468321 4962 scope.go:117] "RemoveContainer" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" Feb 20 10:09:55 crc kubenswrapper[4962]: E0220 10:09:55.468888 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae\": container with ID starting with 55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae not found: ID does not exist" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.468970 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae"} err="failed to get container status \"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae\": rpc error: code = NotFound desc = could not find container \"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae\": container with ID starting with 55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae not found: ID does not exist" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.439905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9zxk" event={"ID":"46f437ac-c97a-4af9-92e7-6bec63b7d8d8","Type":"ContainerStarted","Data":"80d7893ce3e78f46e78f8e3a04c9130047c485f6f805e5e8a4af01ba647e8461"} Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.464716 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t9zxk" podStartSLOduration=1.819098275 podStartE2EDuration="2.464663715s" podCreationTimestamp="2026-02-20 10:09:54 +0000 UTC" firstStartedPulling="2026-02-20 10:09:55.392445514 +0000 UTC m=+886.974917400" lastFinishedPulling="2026-02-20 10:09:56.038010994 +0000 UTC m=+887.620482840" observedRunningTime="2026-02-20 10:09:56.45779669 +0000 UTC m=+888.040268606" watchObservedRunningTime="2026-02-20 10:09:56.464663715 +0000 UTC m=+888.047135601" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.565720 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:09:56 crc kubenswrapper[4962]: E0220 10:09:56.566178 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.566202 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" Feb 20 
10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.566439 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.568582 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.589786 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.637328 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.637478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.637562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.739457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.739543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.739631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.740049 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.740209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.763912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm84x\" (UniqueName: 
\"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.892214 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.148034 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" path="/var/lib/kubelet/pods/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd/volumes" Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.162508 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:09:57 crc kubenswrapper[4962]: W0220 10:09:57.167247 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f794364_dcf5_4d81_9edd_69f7a415540c.slice/crio-1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff WatchSource:0}: Error finding container 1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff: Status 404 returned error can't find the container with id 1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.451115 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" exitCode=0 Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.451238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70"} Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 
10:09:57.451587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerStarted","Data":"1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff"} Feb 20 10:09:58 crc kubenswrapper[4962]: I0220 10:09:58.463017 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" exitCode=0 Feb 20 10:09:58 crc kubenswrapper[4962]: I0220 10:09:58.463082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8"} Feb 20 10:09:59 crc kubenswrapper[4962]: I0220 10:09:59.479020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerStarted","Data":"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5"} Feb 20 10:09:59 crc kubenswrapper[4962]: I0220 10:09:59.512413 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bmc5d" podStartSLOduration=2.099817244 podStartE2EDuration="3.512377784s" podCreationTimestamp="2026-02-20 10:09:56 +0000 UTC" firstStartedPulling="2026-02-20 10:09:57.45411534 +0000 UTC m=+889.036587186" lastFinishedPulling="2026-02-20 10:09:58.86667585 +0000 UTC m=+890.449147726" observedRunningTime="2026-02-20 10:09:59.505983162 +0000 UTC m=+891.088455048" watchObservedRunningTime="2026-02-20 10:09:59.512377784 +0000 UTC m=+891.094849670" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.770371 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:03 crc 
kubenswrapper[4962]: I0220 10:10:03.774231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.788471 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.888833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.888903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.888949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.991699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 
10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.991784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.991832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.992611 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.992613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.025620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:04 crc kubenswrapper[4962]: 
I0220 10:10:04.108755 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.450640 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.527238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerStarted","Data":"798745ded523c335a02cd6d817703b0235b24b33fe3407309e3c81f693c97266"} Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.125127 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.125190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.165888 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.534913 4962 generic.go:334] "Generic (PLEG): container finished" podID="05518aab-48c4-4826-89d9-080858755a80" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" exitCode=0 Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.535000 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76"} Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.570447 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-t9zxk" 
Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.542640 4962 generic.go:334] "Generic (PLEG): container finished" podID="05518aab-48c4-4826-89d9-080858755a80" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" exitCode=0 Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.542686 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc"} Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.892537 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.892856 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.969341 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:07 crc kubenswrapper[4962]: I0220 10:10:07.553821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerStarted","Data":"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e"} Feb 20 10:10:07 crc kubenswrapper[4962]: I0220 10:10:07.573782 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q557r" podStartSLOduration=3.11367405 podStartE2EDuration="4.573760513s" podCreationTimestamp="2026-02-20 10:10:03 +0000 UTC" firstStartedPulling="2026-02-20 10:10:05.538165027 +0000 UTC m=+897.120636873" lastFinishedPulling="2026-02-20 10:10:06.99825148 +0000 UTC m=+898.580723336" observedRunningTime="2026-02-20 10:10:07.57368438 +0000 UTC 
m=+899.156156246" watchObservedRunningTime="2026-02-20 10:10:07.573760513 +0000 UTC m=+899.156232359" Feb 20 10:10:07 crc kubenswrapper[4962]: I0220 10:10:07.627026 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:09 crc kubenswrapper[4962]: I0220 10:10:09.552397 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:10:10 crc kubenswrapper[4962]: I0220 10:10:10.573765 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bmc5d" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" containerID="cri-o://0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" gracePeriod=2 Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.040707 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.141483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"7f794364-dcf5-4d81-9edd-69f7a415540c\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.141619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"7f794364-dcf5-4d81-9edd-69f7a415540c\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.141655 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm84x\" (UniqueName: 
\"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"7f794364-dcf5-4d81-9edd-69f7a415540c\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.142878 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities" (OuterVolumeSpecName: "utilities") pod "7f794364-dcf5-4d81-9edd-69f7a415540c" (UID: "7f794364-dcf5-4d81-9edd-69f7a415540c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.149802 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x" (OuterVolumeSpecName: "kube-api-access-qm84x") pod "7f794364-dcf5-4d81-9edd-69f7a415540c" (UID: "7f794364-dcf5-4d81-9edd-69f7a415540c"). InnerVolumeSpecName "kube-api-access-qm84x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.179093 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f794364-dcf5-4d81-9edd-69f7a415540c" (UID: "7f794364-dcf5-4d81-9edd-69f7a415540c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.243001 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.243041 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.243053 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.507904 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.507996 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585564 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" exitCode=0 Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5"} Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff"} Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585849 4962 scope.go:117] "RemoveContainer" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.586087 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.628808 4962 scope.go:117] "RemoveContainer" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.646824 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.658249 4962 scope.go:117] "RemoveContainer" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.661328 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.699356 4962 scope.go:117] "RemoveContainer" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" Feb 20 10:10:11 crc kubenswrapper[4962]: E0220 10:10:11.699823 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5\": container with ID starting with 0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5 not found: ID does not exist" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.699946 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5"} err="failed to get container status \"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5\": rpc error: code = NotFound desc = could not find container \"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5\": container with ID starting with 0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5 not found: ID does not exist" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700039 4962 scope.go:117] "RemoveContainer" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" Feb 20 10:10:11 crc kubenswrapper[4962]: E0220 10:10:11.700282 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8\": container with ID starting with da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8 not found: ID does not exist" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700358 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8"} err="failed to get container status \"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8\": rpc error: code = NotFound desc = could not find container \"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8\": container with ID 
starting with da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8 not found: ID does not exist" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700427 4962 scope.go:117] "RemoveContainer" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" Feb 20 10:10:11 crc kubenswrapper[4962]: E0220 10:10:11.700871 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70\": container with ID starting with 6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70 not found: ID does not exist" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700966 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70"} err="failed to get container status \"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70\": rpc error: code = NotFound desc = could not find container \"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70\": container with ID starting with 6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70 not found: ID does not exist" Feb 20 10:10:13 crc kubenswrapper[4962]: I0220 10:10:13.156300 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" path="/var/lib/kubelet/pods/7f794364-dcf5-4d81-9edd-69f7a415540c/volumes" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.109529 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.109628 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:14 crc 
kubenswrapper[4962]: I0220 10:10:14.161379 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.659244 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:15 crc kubenswrapper[4962]: I0220 10:10:15.151375 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:16 crc kubenswrapper[4962]: I0220 10:10:16.626490 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q557r" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" containerID="cri-o://df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" gracePeriod=2 Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.204750 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.353525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"05518aab-48c4-4826-89d9-080858755a80\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.353620 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"05518aab-48c4-4826-89d9-080858755a80\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.353666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"05518aab-48c4-4826-89d9-080858755a80\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.354932 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities" (OuterVolumeSpecName: "utilities") pod "05518aab-48c4-4826-89d9-080858755a80" (UID: "05518aab-48c4-4826-89d9-080858755a80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.363946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7" (OuterVolumeSpecName: "kube-api-access-mtrc7") pod "05518aab-48c4-4826-89d9-080858755a80" (UID: "05518aab-48c4-4826-89d9-080858755a80"). InnerVolumeSpecName "kube-api-access-mtrc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.455578 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.455647 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644356 4962 generic.go:334] "Generic (PLEG): container finished" podID="05518aab-48c4-4826-89d9-080858755a80" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" exitCode=0 Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e"} Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"798745ded523c335a02cd6d817703b0235b24b33fe3407309e3c81f693c97266"} Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644528 4962 scope.go:117] "RemoveContainer" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644749 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.678891 4962 scope.go:117] "RemoveContainer" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.714279 4962 scope.go:117] "RemoveContainer" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.742729 4962 scope.go:117] "RemoveContainer" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" Feb 20 10:10:17 crc kubenswrapper[4962]: E0220 10:10:17.743612 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e\": container with ID starting with df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e not found: ID does not exist" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.743670 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e"} err="failed to get container status \"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e\": rpc error: code = NotFound desc = could not find container \"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e\": container with ID starting with df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e not found: ID does not exist" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.743709 4962 scope.go:117] "RemoveContainer" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" Feb 20 10:10:17 crc kubenswrapper[4962]: E0220 10:10:17.744437 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc\": container with ID starting with d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc not found: ID does not exist" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.744577 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc"} err="failed to get container status \"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc\": rpc error: code = NotFound desc = could not find container \"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc\": container with ID starting with d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc not found: ID does not exist" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.744678 4962 scope.go:117] "RemoveContainer" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" Feb 20 10:10:17 crc kubenswrapper[4962]: E0220 10:10:17.745878 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76\": container with ID starting with 3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76 not found: ID does not exist" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.745950 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76"} err="failed to get container status \"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76\": rpc error: code = NotFound desc = could not find container 
\"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76\": container with ID starting with 3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76 not found: ID does not exist" Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.143869 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05518aab-48c4-4826-89d9-080858755a80" (UID: "05518aab-48c4-4826-89d9-080858755a80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.168044 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.287653 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.291362 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.149709 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05518aab-48c4-4826-89d9-080858755a80" path="/var/lib/kubelet/pods/05518aab-48c4-4826-89d9-080858755a80/volumes" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.617976 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"] Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618508 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618519 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618538 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618549 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618556 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618567 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618572 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618583 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618604 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618620 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618859 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618874 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.619723 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.624024 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jv45q" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.639655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"] Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.693013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.693088 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: 
\"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.693231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.794832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.794940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.795003 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 
10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.795672 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.795904 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.821102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.941222 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:20 crc kubenswrapper[4962]: I0220 10:10:20.401218 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"] Feb 20 10:10:20 crc kubenswrapper[4962]: W0220 10:10:20.407179 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db1f907_b4ac_45b1_9f38_93727dfde270.slice/crio-cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460 WatchSource:0}: Error finding container cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460: Status 404 returned error can't find the container with id cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460 Feb 20 10:10:20 crc kubenswrapper[4962]: I0220 10:10:20.708184 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerStarted","Data":"835f023c2b9927593635cc48239ac460d0bfa52240f057d6beb80de9047703e6"} Feb 20 10:10:20 crc kubenswrapper[4962]: I0220 10:10:20.708235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerStarted","Data":"cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460"} Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.362672 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.366030 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.373341 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.419631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.419685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.419721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.521476 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.521524 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.521554 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.522066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.522630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.558916 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.689041 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.715347 4962 generic.go:334] "Generic (PLEG): container finished" podID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerID="835f023c2b9927593635cc48239ac460d0bfa52240f057d6beb80de9047703e6" exitCode=0 Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.715499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"835f023c2b9927593635cc48239ac460d0bfa52240f057d6beb80de9047703e6"} Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.231777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.722327 4962 generic.go:334] "Generic (PLEG): container finished" podID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055" exitCode=0 Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.722384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"} Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.723863 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerStarted","Data":"b0bd9dbd43b0d997d93a5c2fdb222f2ed322bda2a2ba98ad81b098c65e32686b"} Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.725658 4962 generic.go:334] "Generic (PLEG): container finished" podID="3db1f907-b4ac-45b1-9f38-93727dfde270" 
containerID="88dafe47aab03aec8975185374016013491b0dc14ba99fb3f5d221f0618853e5" exitCode=0
Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.725691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"88dafe47aab03aec8975185374016013491b0dc14ba99fb3f5d221f0618853e5"}
Feb 20 10:10:23 crc kubenswrapper[4962]: I0220 10:10:23.738738 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerStarted","Data":"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"}
Feb 20 10:10:23 crc kubenswrapper[4962]: I0220 10:10:23.746917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"304066fbfec48ab67b477f579d2649c382a5c17b23511952ed6af8766db7a80c"}
Feb 20 10:10:23 crc kubenswrapper[4962]: I0220 10:10:23.746743 4962 generic.go:334] "Generic (PLEG): container finished" podID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerID="304066fbfec48ab67b477f579d2649c382a5c17b23511952ed6af8766db7a80c" exitCode=0
Feb 20 10:10:24 crc kubenswrapper[4962]: I0220 10:10:24.757784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"}
Feb 20 10:10:24 crc kubenswrapper[4962]: I0220 10:10:24.757674 4962 generic.go:334] "Generic (PLEG): container finished" podID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536" exitCode=0
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.118116 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.178145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"3db1f907-b4ac-45b1-9f38-93727dfde270\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") "
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.178189 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"3db1f907-b4ac-45b1-9f38-93727dfde270\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") "
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.178254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"3db1f907-b4ac-45b1-9f38-93727dfde270\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") "
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.180387 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle" (OuterVolumeSpecName: "bundle") pod "3db1f907-b4ac-45b1-9f38-93727dfde270" (UID: "3db1f907-b4ac-45b1-9f38-93727dfde270"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.184806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7" (OuterVolumeSpecName: "kube-api-access-hdrp7") pod "3db1f907-b4ac-45b1-9f38-93727dfde270" (UID: "3db1f907-b4ac-45b1-9f38-93727dfde270"). InnerVolumeSpecName "kube-api-access-hdrp7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.191850 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util" (OuterVolumeSpecName: "util") pod "3db1f907-b4ac-45b1-9f38-93727dfde270" (UID: "3db1f907-b4ac-45b1-9f38-93727dfde270"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.279453 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") on node \"crc\" DevicePath \"\""
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.279495 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") on node \"crc\" DevicePath \"\""
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.279508 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.771527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460"}
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.772008 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460"
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.771564 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.775118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerStarted","Data":"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"}
Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.803263 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f74q5" podStartSLOduration=2.3540036349999998 podStartE2EDuration="4.80324475s" podCreationTimestamp="2026-02-20 10:10:21 +0000 UTC" firstStartedPulling="2026-02-20 10:10:22.723736519 +0000 UTC m=+914.306208365" lastFinishedPulling="2026-02-20 10:10:25.172977634 +0000 UTC m=+916.755449480" observedRunningTime="2026-02-20 10:10:25.800736342 +0000 UTC m=+917.383208198" watchObservedRunningTime="2026-02-20 10:10:25.80324475 +0000 UTC m=+917.385716606"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.726849 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"]
Feb 20 10:10:29 crc kubenswrapper[4962]: E0220 10:10:29.727188 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="extract"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727205 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="extract"
Feb 20 10:10:29 crc kubenswrapper[4962]: E0220 10:10:29.727221 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="util"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727229 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="util"
Feb 20 10:10:29 crc kubenswrapper[4962]: E0220 10:10:29.727246 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="pull"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727255 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="pull"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727399 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="extract"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727919 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.732282 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rw5vz"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.764105 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"]
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.848662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdvb\" (UniqueName: \"kubernetes.io/projected/ad363690-9ad6-4f45-ac02-d51ec41d213b-kube-api-access-xrdvb\") pod \"openstack-operator-controller-init-6679bf9b57-n5hm2\" (UID: \"ad363690-9ad6-4f45-ac02-d51ec41d213b\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.950220 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdvb\" (UniqueName: \"kubernetes.io/projected/ad363690-9ad6-4f45-ac02-d51ec41d213b-kube-api-access-xrdvb\") pod \"openstack-operator-controller-init-6679bf9b57-n5hm2\" (UID: \"ad363690-9ad6-4f45-ac02-d51ec41d213b\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.975549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdvb\" (UniqueName: \"kubernetes.io/projected/ad363690-9ad6-4f45-ac02-d51ec41d213b-kube-api-access-xrdvb\") pod \"openstack-operator-controller-init-6679bf9b57-n5hm2\" (UID: \"ad363690-9ad6-4f45-ac02-d51ec41d213b\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:30 crc kubenswrapper[4962]: I0220 10:10:30.044580 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:30 crc kubenswrapper[4962]: I0220 10:10:30.491384 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"]
Feb 20 10:10:30 crc kubenswrapper[4962]: I0220 10:10:30.809857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" event={"ID":"ad363690-9ad6-4f45-ac02-d51ec41d213b","Type":"ContainerStarted","Data":"f27c2da4df93f2a4e89660a26a4a7dc15f6c65172f7b6c3b247bf33b3636e709"}
Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.689636 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f74q5"
Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.689709 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f74q5"
Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.748254 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f74q5"
Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.865740 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f74q5"
Feb 20 10:10:34 crc kubenswrapper[4962]: I0220 10:10:34.353905 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f74q5"]
Feb 20 10:10:34 crc kubenswrapper[4962]: I0220 10:10:34.354832 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f74q5" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server" containerID="cri-o://789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" gracePeriod=2
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.422145 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f74q5"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.562892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") "
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.563004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") "
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.563080 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") "
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.563897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities" (OuterVolumeSpecName: "utilities") pod "bea00b9c-e00f-4cec-b1bf-9955dd868c9c" (UID: "bea00b9c-e00f-4cec-b1bf-9955dd868c9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.579670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j" (OuterVolumeSpecName: "kube-api-access-pqb4j") pod "bea00b9c-e00f-4cec-b1bf-9955dd868c9c" (UID: "bea00b9c-e00f-4cec-b1bf-9955dd868c9c"). InnerVolumeSpecName "kube-api-access-pqb4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.620271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bea00b9c-e00f-4cec-b1bf-9955dd868c9c" (UID: "bea00b9c-e00f-4cec-b1bf-9955dd868c9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.664909 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.664941 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.664951 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") on node \"crc\" DevicePath \"\""
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.862175 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" event={"ID":"ad363690-9ad6-4f45-ac02-d51ec41d213b","Type":"ContainerStarted","Data":"ee0600b1964e6d09850f03e96e548348a6c0d60851d33fdd4e668f460bbd691b"}
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.862578 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864410 4962 generic.go:334] "Generic (PLEG): container finished" podID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" exitCode=0
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"}
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"b0bd9dbd43b0d997d93a5c2fdb222f2ed322bda2a2ba98ad81b098c65e32686b"}
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864516 4962 scope.go:117] "RemoveContainer" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864729 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f74q5"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.887043 4962 scope.go:117] "RemoveContainer" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.898342 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" podStartSLOduration=2.220170803 podStartE2EDuration="6.898324801s" podCreationTimestamp="2026-02-20 10:10:29 +0000 UTC" firstStartedPulling="2026-02-20 10:10:30.527298062 +0000 UTC m=+922.109769928" lastFinishedPulling="2026-02-20 10:10:35.20545208 +0000 UTC m=+926.787923926" observedRunningTime="2026-02-20 10:10:35.898171527 +0000 UTC m=+927.480643373" watchObservedRunningTime="2026-02-20 10:10:35.898324801 +0000 UTC m=+927.480796647"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.911955 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f74q5"]
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.916059 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f74q5"]
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.919761 4962 scope.go:117] "RemoveContainer" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.933967 4962 scope.go:117] "RemoveContainer" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"
Feb 20 10:10:35 crc kubenswrapper[4962]: E0220 10:10:35.934480 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237\": container with ID starting with 789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237 not found: ID does not exist" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934526 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"} err="failed to get container status \"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237\": rpc error: code = NotFound desc = could not find container \"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237\": container with ID starting with 789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237 not found: ID does not exist"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934553 4962 scope.go:117] "RemoveContainer" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"
Feb 20 10:10:35 crc kubenswrapper[4962]: E0220 10:10:35.934945 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536\": container with ID starting with 4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536 not found: ID does not exist" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934967 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"} err="failed to get container status \"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536\": rpc error: code = NotFound desc = could not find container \"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536\": container with ID starting with 4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536 not found: ID does not exist"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934981 4962 scope.go:117] "RemoveContainer" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"
Feb 20 10:10:35 crc kubenswrapper[4962]: E0220 10:10:35.935245 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055\": container with ID starting with be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055 not found: ID does not exist" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"
Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.935264 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"} err="failed to get container status \"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055\": rpc error: code = NotFound desc = could not find container \"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055\": container with ID starting with be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055 not found: ID does not exist"
Feb 20 10:10:37 crc kubenswrapper[4962]: I0220 10:10:37.147966 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" path="/var/lib/kubelet/pods/bea00b9c-e00f-4cec-b1bf-9955dd868c9c/volumes"
Feb 20 10:10:40 crc kubenswrapper[4962]: I0220 10:10:40.047327 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.508556 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.509215 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.509318 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46"
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.510314 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.510419 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce" gracePeriod=600
Feb 20 10:10:41 crc kubenswrapper[4962]: E0220 10:10:41.684859 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751d5e0b_919c_4777_8475_ed7214f7647f.slice/crio-conmon-00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.904357 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce" exitCode=0
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.904422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce"}
Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.904692 4962 scope.go:117] "RemoveContainer" containerID="f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"
Feb 20 10:10:42 crc kubenswrapper[4962]: I0220 10:10:42.916300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1"}
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.309993 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"]
Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.310927 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.310943 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server"
Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.310952 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-content"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.310959 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-content"
Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.310986 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-utilities"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.310992 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-utilities"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.311106 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.311613 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.314299 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lchp8"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.329826 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.347989 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdbn\" (UniqueName: \"kubernetes.io/projected/e0560856-ed00-4ea8-8ce7-a801f1d46489-kube-api-access-8tdbn\") pod \"barbican-operator-controller-manager-868647ff47-nhpg5\" (UID: \"e0560856-ed00-4ea8-8ce7-a801f1d46489\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.352663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.353684 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.356925 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qd5jx"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.357156 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.358274 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.361378 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-d2jgz"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.382691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.389670 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.390841 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.400305 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ldhhr"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.423355 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.424412 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.428246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lhcl7"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.438659 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5ts\" (UniqueName: \"kubernetes.io/projected/ea986843-26e4-4410-a65e-ae51c02dc04c-kube-api-access-sr5ts\") pod \"glance-operator-controller-manager-77987464f4-wcqzf\" (UID: \"ea986843-26e4-4410-a65e-ae51c02dc04c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450167 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/ac33f7ed-c3f8-487d-89dc-4a614d357b86-kube-api-access-xxzwg\") pod \"cinder-operator-controller-manager-5d946d989d-bsq9n\" (UID: \"ac33f7ed-c3f8-487d-89dc-4a614d357b86\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdbn\" (UniqueName: \"kubernetes.io/projected/e0560856-ed00-4ea8-8ce7-a801f1d46489-kube-api-access-8tdbn\") pod \"barbican-operator-controller-manager-868647ff47-nhpg5\" (UID: \"e0560856-ed00-4ea8-8ce7-a801f1d46489\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5qxh\" (UniqueName: \"kubernetes.io/projected/fee6970c-0ad7-46ea-ab75-dcb7d552ffbb-kube-api-access-s5qxh\") pod \"heat-operator-controller-manager-69f49c598c-75vx4\" (UID: \"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72m9\" (UniqueName: \"kubernetes.io/projected/cf0e10ba-c175-44c3-9011-6646f21ba334-kube-api-access-h72m9\") pod \"designate-operator-controller-manager-6d8bf5c495-r2t72\" (UID: \"cf0e10ba-c175-44c3-9011-6646f21ba334\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.461063 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.470816 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.482950 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.483970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.489855 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.490610 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.499787 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fprm6"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.500051 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.500222 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r9l78"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.500254 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdbn\" (UniqueName: \"kubernetes.io/projected/e0560856-ed00-4ea8-8ce7-a801f1d46489-kube-api-access-8tdbn\") pod \"barbican-operator-controller-manager-868647ff47-nhpg5\" (UID: \"e0560856-ed00-4ea8-8ce7-a801f1d46489\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.504684 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.505799 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.507989 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rcb6k"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.512087 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.518655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.523298 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.531663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.534335 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.538626 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"]
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.542125 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-65srs"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dmxc\" (UniqueName: \"kubernetes.io/projected/12f33757-f329-47a6-9273-bdeb1558a4d7-kube-api-access-5dmxc\") pod \"horizon-operator-controller-manager-5b9b8895d5-rhhc7\" (UID: \"12f33757-f329-47a6-9273-bdeb1558a4d7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd9qj\" (UniqueName: \"kubernetes.io/projected/5fec06f1-8ccf-403c-88de-2b581f056802-kube-api-access-kd9qj\") pod \"ironic-operator-controller-manager-554564d7fc-2hg4n\" (UID: \"5fec06f1-8ccf-403c-88de-2b581f056802\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"
Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551433 4962 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-s5qxh\" (UniqueName: \"kubernetes.io/projected/fee6970c-0ad7-46ea-ab75-dcb7d552ffbb-kube-api-access-s5qxh\") pod \"heat-operator-controller-manager-69f49c598c-75vx4\" (UID: \"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h72m9\" (UniqueName: \"kubernetes.io/projected/cf0e10ba-c175-44c3-9011-6646f21ba334-kube-api-access-h72m9\") pod \"designate-operator-controller-manager-6d8bf5c495-r2t72\" (UID: \"cf0e10ba-c175-44c3-9011-6646f21ba334\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551494 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8bc\" (UniqueName: \"kubernetes.io/projected/7afb870a-75a4-42d5-9704-5cef14dd3ce9-kube-api-access-rw8bc\") pod \"keystone-operator-controller-manager-b4d948c87-jjbwt\" (UID: \"7afb870a-75a4-42d5-9704-5cef14dd3ce9\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85cx\" (UniqueName: \"kubernetes.io/projected/0c8c62e9-0201-43a4-b823-82af87a0977e-kube-api-access-c85cx\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5ts\" (UniqueName: 
\"kubernetes.io/projected/ea986843-26e4-4410-a65e-ae51c02dc04c-kube-api-access-sr5ts\") pod \"glance-operator-controller-manager-77987464f4-wcqzf\" (UID: \"ea986843-26e4-4410-a65e-ae51c02dc04c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/ac33f7ed-c3f8-487d-89dc-4a614d357b86-kube-api-access-xxzwg\") pod \"cinder-operator-controller-manager-5d946d989d-bsq9n\" (UID: \"ac33f7ed-c3f8-487d-89dc-4a614d357b86\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.558196 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.559458 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.562991 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q9kkm" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.570332 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.578054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/ac33f7ed-c3f8-487d-89dc-4a614d357b86-kube-api-access-xxzwg\") pod \"cinder-operator-controller-manager-5d946d989d-bsq9n\" (UID: \"ac33f7ed-c3f8-487d-89dc-4a614d357b86\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.584108 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5qxh\" (UniqueName: \"kubernetes.io/projected/fee6970c-0ad7-46ea-ab75-dcb7d552ffbb-kube-api-access-s5qxh\") pod \"heat-operator-controller-manager-69f49c598c-75vx4\" (UID: \"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.590973 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.592805 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.597488 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s97hl" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.599204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5ts\" (UniqueName: \"kubernetes.io/projected/ea986843-26e4-4410-a65e-ae51c02dc04c-kube-api-access-sr5ts\") pod \"glance-operator-controller-manager-77987464f4-wcqzf\" (UID: \"ea986843-26e4-4410-a65e-ae51c02dc04c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.606357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72m9\" (UniqueName: \"kubernetes.io/projected/cf0e10ba-c175-44c3-9011-6646f21ba334-kube-api-access-h72m9\") pod \"designate-operator-controller-manager-6d8bf5c495-r2t72\" (UID: \"cf0e10ba-c175-44c3-9011-6646f21ba334\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.618170 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.634467 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.644766 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.647799 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.659529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.659647 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dmxc\" (UniqueName: \"kubernetes.io/projected/12f33757-f329-47a6-9273-bdeb1558a4d7-kube-api-access-5dmxc\") pod \"horizon-operator-controller-manager-5b9b8895d5-rhhc7\" (UID: \"12f33757-f329-47a6-9273-bdeb1558a4d7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.660041 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jds7h" Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.660393 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.660455 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.160432457 +0000 UTC m=+963.742904303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.668453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd9qj\" (UniqueName: \"kubernetes.io/projected/5fec06f1-8ccf-403c-88de-2b581f056802-kube-api-access-kd9qj\") pod \"ironic-operator-controller-manager-554564d7fc-2hg4n\" (UID: \"5fec06f1-8ccf-403c-88de-2b581f056802\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.668656 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8bc\" (UniqueName: \"kubernetes.io/projected/7afb870a-75a4-42d5-9704-5cef14dd3ce9-kube-api-access-rw8bc\") pod \"keystone-operator-controller-manager-b4d948c87-jjbwt\" (UID: \"7afb870a-75a4-42d5-9704-5cef14dd3ce9\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.677837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85cx\" (UniqueName: \"kubernetes.io/projected/0c8c62e9-0201-43a4-b823-82af87a0977e-kube-api-access-c85cx\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.704446 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd9qj\" (UniqueName: \"kubernetes.io/projected/5fec06f1-8ccf-403c-88de-2b581f056802-kube-api-access-kd9qj\") pod \"ironic-operator-controller-manager-554564d7fc-2hg4n\" (UID: 
\"5fec06f1-8ccf-403c-88de-2b581f056802\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.707776 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.708565 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.711364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dmxc\" (UniqueName: \"kubernetes.io/projected/12f33757-f329-47a6-9273-bdeb1558a4d7-kube-api-access-5dmxc\") pod \"horizon-operator-controller-manager-5b9b8895d5-rhhc7\" (UID: \"12f33757-f329-47a6-9273-bdeb1558a4d7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.716290 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8bc\" (UniqueName: \"kubernetes.io/projected/7afb870a-75a4-42d5-9704-5cef14dd3ce9-kube-api-access-rw8bc\") pod \"keystone-operator-controller-manager-b4d948c87-jjbwt\" (UID: \"7afb870a-75a4-42d5-9704-5cef14dd3ce9\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.717672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.728946 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.756515 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.765738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85cx\" (UniqueName: \"kubernetes.io/projected/0c8c62e9-0201-43a4-b823-82af87a0977e-kube-api-access-c85cx\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.769242 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.777302 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.777458 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.778810 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.779662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6mt\" (UniqueName: \"kubernetes.io/projected/a9979be5-6650-425b-a748-51e2cb552413-kube-api-access-4s6mt\") pod \"manila-operator-controller-manager-54f6768c69-6lvhz\" (UID: \"a9979be5-6650-425b-a748-51e2cb552413\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.779792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq8g\" (UniqueName: \"kubernetes.io/projected/6fdeab3e-de35-4d69-9e67-e5d8257bc25d-kube-api-access-rzq8g\") pod \"neutron-operator-controller-manager-64ddbf8bb-knwp9\" (UID: \"6fdeab3e-de35-4d69-9e67-e5d8257bc25d\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.779895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwnb\" (UniqueName: \"kubernetes.io/projected/f8f1dca9-8b83-469d-b834-3f11376576c9-kube-api-access-smwnb\") pod \"mariadb-operator-controller-manager-6994f66f48-wn92v\" (UID: \"f8f1dca9-8b83-469d-b834-3f11376576c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.787158 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5xxdq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.787411 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-knql5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 
10:11:11.788079 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.797437 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.810434 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.811554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.817883 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gb5fm" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.818994 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.822189 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.825471 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-khxcn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.827331 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.828333 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.834064 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.844015 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8d99h" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.845982 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.876674 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883646 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6mt\" (UniqueName: \"kubernetes.io/projected/a9979be5-6650-425b-a748-51e2cb552413-kube-api-access-4s6mt\") pod \"manila-operator-controller-manager-54f6768c69-6lvhz\" (UID: \"a9979be5-6650-425b-a748-51e2cb552413\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfbq\" (UniqueName: \"kubernetes.io/projected/34cb38e0-7c0a-4f00-89e9-9be7b394585d-kube-api-access-wlfbq\") pod \"placement-operator-controller-manager-8497b45c89-x4gh4\" (UID: \"34cb38e0-7c0a-4f00-89e9-9be7b394585d\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883772 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cn9n\" (UniqueName: \"kubernetes.io/projected/72728d52-a8e9-4689-8da0-871f250f7664-kube-api-access-8cn9n\") pod \"ovn-operator-controller-manager-d44cf6b75-nlq5k\" (UID: \"72728d52-a8e9-4689-8da0-871f250f7664\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq8g\" (UniqueName: \"kubernetes.io/projected/6fdeab3e-de35-4d69-9e67-e5d8257bc25d-kube-api-access-rzq8g\") pod 
\"neutron-operator-controller-manager-64ddbf8bb-knwp9\" (UID: \"6fdeab3e-de35-4d69-9e67-e5d8257bc25d\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7dkp\" (UniqueName: \"kubernetes.io/projected/f8de466d-f069-4a8e-8598-72a163525c24-kube-api-access-x7dkp\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smwnb\" (UniqueName: \"kubernetes.io/projected/f8f1dca9-8b83-469d-b834-3f11376576c9-kube-api-access-smwnb\") pod \"mariadb-operator-controller-manager-6994f66f48-wn92v\" (UID: \"f8f1dca9-8b83-469d-b834-3f11376576c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g57r5\" (UniqueName: \"kubernetes.io/projected/14efe385-5147-49ed-a42f-804b91438a55-kube-api-access-g57r5\") pod \"octavia-operator-controller-manager-69f8888797-ln4sp\" (UID: \"14efe385-5147-49ed-a42f-804b91438a55\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") 
" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.884802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5t22\" (UniqueName: \"kubernetes.io/projected/4e2614ed-ea7a-430e-af7b-4d66f05f7b96-kube-api-access-c5t22\") pod \"nova-operator-controller-manager-567668f5cf-d2clq\" (UID: \"4e2614ed-ea7a-430e-af7b-4d66f05f7b96\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.885388 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.885607 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.911009 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.914243 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qn79d" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.924017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.930768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6mt\" (UniqueName: \"kubernetes.io/projected/a9979be5-6650-425b-a748-51e2cb552413-kube-api-access-4s6mt\") pod \"manila-operator-controller-manager-54f6768c69-6lvhz\" (UID: \"a9979be5-6650-425b-a748-51e2cb552413\") " 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.930796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwnb\" (UniqueName: \"kubernetes.io/projected/f8f1dca9-8b83-469d-b834-3f11376576c9-kube-api-access-smwnb\") pod \"mariadb-operator-controller-manager-6994f66f48-wn92v\" (UID: \"f8f1dca9-8b83-469d-b834-3f11376576c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.938312 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq8g\" (UniqueName: \"kubernetes.io/projected/6fdeab3e-de35-4d69-9e67-e5d8257bc25d-kube-api-access-rzq8g\") pod \"neutron-operator-controller-manager-64ddbf8bb-knwp9\" (UID: \"6fdeab3e-de35-4d69-9e67-e5d8257bc25d\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.941114 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.978249 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.985655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsp6\" (UniqueName: \"kubernetes.io/projected/4a325f02-ddda-49e9-9ef0-40fd4726b09f-kube-api-access-7jsp6\") pod \"swift-operator-controller-manager-68f46476f-9pxbg\" (UID: \"4a325f02-ddda-49e9-9ef0-40fd4726b09f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g57r5\" (UniqueName: \"kubernetes.io/projected/14efe385-5147-49ed-a42f-804b91438a55-kube-api-access-g57r5\") pod \"octavia-operator-controller-manager-69f8888797-ln4sp\" (UID: \"14efe385-5147-49ed-a42f-804b91438a55\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986777 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5t22\" (UniqueName: \"kubernetes.io/projected/4e2614ed-ea7a-430e-af7b-4d66f05f7b96-kube-api-access-c5t22\") pod 
\"nova-operator-controller-manager-567668f5cf-d2clq\" (UID: \"4e2614ed-ea7a-430e-af7b-4d66f05f7b96\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfbq\" (UniqueName: \"kubernetes.io/projected/34cb38e0-7c0a-4f00-89e9-9be7b394585d-kube-api-access-wlfbq\") pod \"placement-operator-controller-manager-8497b45c89-x4gh4\" (UID: \"34cb38e0-7c0a-4f00-89e9-9be7b394585d\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986901 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cn9n\" (UniqueName: \"kubernetes.io/projected/72728d52-a8e9-4689-8da0-871f250f7664-kube-api-access-8cn9n\") pod \"ovn-operator-controller-manager-d44cf6b75-nlq5k\" (UID: \"72728d52-a8e9-4689-8da0-871f250f7664\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986944 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7dkp\" (UniqueName: \"kubernetes.io/projected/f8de466d-f069-4a8e-8598-72a163525c24-kube-api-access-x7dkp\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.987235 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.987295 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.487279909 +0000 UTC m=+964.069751755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.000798 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.002112 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.005696 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dck5b" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.006969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.012270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfbq\" (UniqueName: \"kubernetes.io/projected/34cb38e0-7c0a-4f00-89e9-9be7b394585d-kube-api-access-wlfbq\") pod \"placement-operator-controller-manager-8497b45c89-x4gh4\" (UID: \"34cb38e0-7c0a-4f00-89e9-9be7b394585d\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.020555 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c5t22\" (UniqueName: \"kubernetes.io/projected/4e2614ed-ea7a-430e-af7b-4d66f05f7b96-kube-api-access-c5t22\") pod \"nova-operator-controller-manager-567668f5cf-d2clq\" (UID: \"4e2614ed-ea7a-430e-af7b-4d66f05f7b96\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.026102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g57r5\" (UniqueName: \"kubernetes.io/projected/14efe385-5147-49ed-a42f-804b91438a55-kube-api-access-g57r5\") pod \"octavia-operator-controller-manager-69f8888797-ln4sp\" (UID: \"14efe385-5147-49ed-a42f-804b91438a55\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.026796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7dkp\" (UniqueName: \"kubernetes.io/projected/f8de466d-f069-4a8e-8598-72a163525c24-kube-api-access-x7dkp\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.026887 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-lxl4x"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.028747 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.034700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kszqw" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.037722 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-lxl4x"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.060937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cn9n\" (UniqueName: \"kubernetes.io/projected/72728d52-a8e9-4689-8da0-871f250f7664-kube-api-access-8cn9n\") pod \"ovn-operator-controller-manager-d44cf6b75-nlq5k\" (UID: \"72728d52-a8e9-4689-8da0-871f250f7664\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.073650 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.074459 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.087951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsp6\" (UniqueName: \"kubernetes.io/projected/4a325f02-ddda-49e9-9ef0-40fd4726b09f-kube-api-access-7jsp6\") pod \"swift-operator-controller-manager-68f46476f-9pxbg\" (UID: \"4a325f02-ddda-49e9-9ef0-40fd4726b09f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.088122 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p72nh\" (UniqueName: \"kubernetes.io/projected/32d42cbd-4ea1-49cc-b9d4-33fe5f655a16-kube-api-access-p72nh\") pod \"test-operator-controller-manager-7866795846-lxl4x\" (UID: \"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16\") " pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.088297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpn7z\" (UniqueName: \"kubernetes.io/projected/7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084-kube-api-access-bpn7z\") pod \"telemetry-operator-controller-manager-7f45b4ff68-mfpm9\" (UID: \"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.097795 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.113865 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.114388 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsp6\" (UniqueName: \"kubernetes.io/projected/4a325f02-ddda-49e9-9ef0-40fd4726b09f-kube-api-access-7jsp6\") pod \"swift-operator-controller-manager-68f46476f-9pxbg\" (UID: \"4a325f02-ddda-49e9-9ef0-40fd4726b09f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.119155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.125359 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.131730 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5rxms" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.145923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.157179 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.167451 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.195709 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199382 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpn7z\" (UniqueName: \"kubernetes.io/projected/7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084-kube-api-access-bpn7z\") pod \"telemetry-operator-controller-manager-7f45b4ff68-mfpm9\" (UID: \"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxdt\" (UniqueName: \"kubernetes.io/projected/4c8bff11-1a85-4f9b-8fb2-defd04ac22d1-kube-api-access-bjxdt\") pod \"watcher-operator-controller-manager-5db88f68c-kthxs\" (UID: \"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p72nh\" (UniqueName: \"kubernetes.io/projected/32d42cbd-4ea1-49cc-b9d4-33fe5f655a16-kube-api-access-p72nh\") pod 
\"test-operator-controller-manager-7866795846-lxl4x\" (UID: \"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16\") " pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.200228 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.200354 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.200315598 +0000 UTC m=+964.782787444 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.210576 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.214039 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.220318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p72nh\" (UniqueName: \"kubernetes.io/projected/32d42cbd-4ea1-49cc-b9d4-33fe5f655a16-kube-api-access-p72nh\") pod \"test-operator-controller-manager-7866795846-lxl4x\" (UID: \"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16\") " pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.221373 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.221434 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.222246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nl4z6" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.227795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpn7z\" (UniqueName: \"kubernetes.io/projected/7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084-kube-api-access-bpn7z\") pod \"telemetry-operator-controller-manager-7f45b4ff68-mfpm9\" (UID: \"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.235998 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.273572 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv"] Feb 20 10:11:12 crc kubenswrapper[4962]: 
I0220 10:11:12.274652 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.274808 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.289574 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-785xk" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.290144 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxdt\" (UniqueName: \"kubernetes.io/projected/4c8bff11-1a85-4f9b-8fb2-defd04ac22d1-kube-api-access-bjxdt\") pod \"watcher-operator-controller-manager-5db88f68c-kthxs\" (UID: \"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306564 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfhj\" (UniqueName: 
\"kubernetes.io/projected/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-kube-api-access-qgfhj\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.318554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.337821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxdt\" (UniqueName: \"kubernetes.io/projected/4c8bff11-1a85-4f9b-8fb2-defd04ac22d1-kube-api-access-bjxdt\") pod \"watcher-operator-controller-manager-5db88f68c-kthxs\" (UID: \"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.338219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.363742 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.401350 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408110 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g586b\" (UniqueName: \"kubernetes.io/projected/5691d6ef-dedb-4a46-a1b6-0435e9f6db0a-kube-api-access-g586b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mrjv\" (UID: \"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfhj\" (UniqueName: \"kubernetes.io/projected/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-kube-api-access-qgfhj\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410754 
4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410841 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.910819798 +0000 UTC m=+964.493291644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410921 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410947 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.910940662 +0000 UTC m=+964.493412508 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.433463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfhj\" (UniqueName: \"kubernetes.io/projected/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-kube-api-access-qgfhj\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.510053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g586b\" (UniqueName: \"kubernetes.io/projected/5691d6ef-dedb-4a46-a1b6-0435e9f6db0a-kube-api-access-g586b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mrjv\" (UID: \"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.510112 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.511303 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.511353 4962 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.511334294 +0000 UTC m=+965.093806150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.534580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g586b\" (UniqueName: \"kubernetes.io/projected/5691d6ef-dedb-4a46-a1b6-0435e9f6db0a-kube-api-access-g586b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mrjv\" (UID: \"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.691995 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.921765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.921864 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922047 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922124 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.922097383 +0000 UTC m=+965.504569229 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922677 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922710 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.922702162 +0000 UTC m=+965.505174008 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.930886 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" event={"ID":"e0560856-ed00-4ea8-8ce7-a801f1d46489","Type":"ContainerStarted","Data":"de2121809761692983b3cd6a346cd8b6ff6b8bfe36194b1bba368bf6b4f129b6"} Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.938988 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.234196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: 
\"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.234450 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.234568 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.234543174 +0000 UTC m=+966.817015020 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.416433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.422053 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.438493 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee6970c_0ad7_46ea_ab75_dcb7d552ffbb.slice/crio-a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a WatchSource:0}: Error finding container a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a: Status 404 returned error can't find the container with id a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.442492 4962 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.452404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.461673 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.470196 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.471110 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2614ed_ea7a_430e_af7b_4d66f05f7b96.slice/crio-34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97 WatchSource:0}: Error finding container 34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97: Status 404 returned error can't find the container with id 34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97 Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.472952 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f33757_f329_47a6_9273_bdeb1558a4d7.slice/crio-fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21 WatchSource:0}: Error finding container fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21: Status 404 returned error can't find the container with id fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21 Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.543069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.543939 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.544051 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.544019394 +0000 UTC m=+967.126491240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.619560 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.629911 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.658009 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cb38e0_7c0a_4f00_89e9_9be7b394585d.slice/crio-06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597 WatchSource:0}: Error finding container 
06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597: Status 404 returned error can't find the container with id 06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597 Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.661957 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9979be5_6650_425b_a748_51e2cb552413.slice/crio-820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252 WatchSource:0}: Error finding container 820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252: Status 404 returned error can't find the container with id 820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252 Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.826250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.842638 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.847146 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.857761 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f1dca9_8b83_469d_b834_3f11376576c9.slice/crio-aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2 WatchSource:0}: Error finding container aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2: Status 404 returned error can't find the container with id aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2 Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.953914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" event={"ID":"cf0e10ba-c175-44c3-9011-6646f21ba334","Type":"ContainerStarted","Data":"6234a297a899024702f26c5ce04c827d4d29f60d147c6493842e9dfd573eec3d"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.959568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.959804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.959885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" event={"ID":"ac33f7ed-c3f8-487d-89dc-4a614d357b86","Type":"ContainerStarted","Data":"f6b8e8d13e2962c4b74341b6b7c64c5cabeb00aa2859bb8701a3cfd8464d79d5"} Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960066 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960139 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. 
No retries permitted until 2026-02-20 10:11:15.960119736 +0000 UTC m=+967.542591572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960151 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960210 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.960190078 +0000 UTC m=+967.542661924 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.963091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" event={"ID":"ea986843-26e4-4410-a65e-ae51c02dc04c","Type":"ContainerStarted","Data":"b0b1de95d5af32b9a4d58731192b6b17db0019a768e764e75f0de147ab865dc4"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.965291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" event={"ID":"5fec06f1-8ccf-403c-88de-2b581f056802","Type":"ContainerStarted","Data":"67d77cc22a959c91c108f2f6a7c834e9424ac410177673a61e232f10a3a001b2"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 
10:11:13.970031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" event={"ID":"f8f1dca9-8b83-469d-b834-3f11376576c9","Type":"ContainerStarted","Data":"aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.993934 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" event={"ID":"7afb870a-75a4-42d5-9704-5cef14dd3ce9","Type":"ContainerStarted","Data":"d4d22f66bafe4ea03724a1734229c454926f1f805e14f6c6b482418891ded9f8"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.000523 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" event={"ID":"4e2614ed-ea7a-430e-af7b-4d66f05f7b96","Type":"ContainerStarted","Data":"34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.001641 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" event={"ID":"34cb38e0-7c0a-4f00-89e9-9be7b394585d","Type":"ContainerStarted","Data":"06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.003322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" event={"ID":"a9979be5-6650-425b-a748-51e2cb552413","Type":"ContainerStarted","Data":"820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.026130 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv"] Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.035925 4962 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp"] Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.038781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" event={"ID":"12f33757-f329-47a6-9273-bdeb1558a4d7","Type":"ContainerStarted","Data":"fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21"} Feb 20 10:11:14 crc kubenswrapper[4962]: W0220 10:11:14.044385 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d077bc6_8a1e_426a_9b2d_8e6b2a5eb084.slice/crio-f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79 WatchSource:0}: Error finding container f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79: Status 404 returned error can't find the container with id f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79 Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.051647 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" event={"ID":"6fdeab3e-de35-4d69-9e67-e5d8257bc25d","Type":"ContainerStarted","Data":"6652b8540b54ec3d0190a09e12606a00ef4a19eb8a27c4440400287cf40c8aeb"} Feb 20 10:11:14 crc kubenswrapper[4962]: W0220 10:11:14.054646 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72728d52_a8e9_4689_8da0_871f250f7664.slice/crio-f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d WatchSource:0}: Error finding container f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d: Status 404 returned error can't find the container with id f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.054782 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9"] Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.057491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" event={"ID":"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb","Type":"ContainerStarted","Data":"a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a"} Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.062428 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g57r5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-ln4sp_openstack-operators(14efe385-5147-49ed-a42f-804b91438a55): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.062581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k"] Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.063549 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podUID="14efe385-5147-49ed-a42f-804b91438a55" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.065790 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jsp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-9pxbg_openstack-operators(4a325f02-ddda-49e9-9ef0-40fd4726b09f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.065969 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} 
{} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g586b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5mrjv_openstack-operators(5691d6ef-dedb-4a46-a1b6-0435e9f6db0a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.067110 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podUID="5691d6ef-dedb-4a46-a1b6-0435e9f6db0a" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.067164 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.071158 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs"] Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.075739 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p72nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-lxl4x_openstack-operators(32d42cbd-4ea1-49cc-b9d4-33fe5f655a16): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.077495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podUID="32d42cbd-4ea1-49cc-b9d4-33fe5f655a16" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.078360 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-lxl4x"] Feb 20 10:11:14 crc kubenswrapper[4962]: W0220 10:11:14.082074 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8bff11_1a85_4f9b_8fb2_defd04ac22d1.slice/crio-dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca WatchSource:0}: Error finding container dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca: Status 404 returned error can't find the container with id 
dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.085512 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjxdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-kthxs_openstack-operators(4c8bff11-1a85-4f9b-8fb2-defd04ac22d1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.086745 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podUID="4c8bff11-1a85-4f9b-8fb2-defd04ac22d1" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.093255 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg"] Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.068715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" event={"ID":"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084","Type":"ContainerStarted","Data":"f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.071757 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" event={"ID":"72728d52-a8e9-4689-8da0-871f250f7664","Type":"ContainerStarted","Data":"f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.074957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" event={"ID":"14efe385-5147-49ed-a42f-804b91438a55","Type":"ContainerStarted","Data":"78af8ed74838ca01a42875f1e67f992fae04fded77b2a1aa0226f965681dca04"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.081078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" event={"ID":"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a","Type":"ContainerStarted","Data":"40dd1a45cd28ca81348e73822285443e43967275c254cf1ffeeb5d0fb1350c2b"} Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.082342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podUID="14efe385-5147-49ed-a42f-804b91438a55" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.083150 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podUID="5691d6ef-dedb-4a46-a1b6-0435e9f6db0a" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.148024 4962 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.149099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" event={"ID":"4a325f02-ddda-49e9-9ef0-40fd4726b09f","Type":"ContainerStarted","Data":"cbee5d17816c8b4621117d17f860f5fa123589b5fbd03ea826686f8ebd2a55a4"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.149246 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" event={"ID":"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16","Type":"ContainerStarted","Data":"b976381d23336d023db12c345f3987af523011b25e1cd393ab8dcdad4bb365dc"} Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.150637 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podUID="32d42cbd-4ea1-49cc-b9d4-33fe5f655a16" Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.150864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" event={"ID":"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1","Type":"ContainerStarted","Data":"dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca"} Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.156971 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podUID="4c8bff11-1a85-4f9b-8fb2-defd04ac22d1" Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.285517 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.285758 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.285849 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:19.285824401 +0000 UTC m=+970.868296247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.591031 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.591305 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.591423 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:19.591392742 +0000 UTC m=+971.173864588 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: I0220 10:11:16.005051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.005297 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: I0220 10:11:16.005670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.005795 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:20.00573064 +0000 UTC m=+971.588202486 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.005904 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.006312 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:20.006282977 +0000 UTC m=+971.588754823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.166174 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podUID="32d42cbd-4ea1-49cc-b9d4-33fe5f655a16" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.166994 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podUID="5691d6ef-dedb-4a46-a1b6-0435e9f6db0a" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.167042 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podUID="14efe385-5147-49ed-a42f-804b91438a55" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.167096 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podUID="4c8bff11-1a85-4f9b-8fb2-defd04ac22d1" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.167358 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:17 crc kubenswrapper[4962]: E0220 10:11:17.171945 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" 
podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:19 crc kubenswrapper[4962]: I0220 10:11:19.369248 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.369455 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.369979 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:27.369952614 +0000 UTC m=+978.952424470 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:19 crc kubenswrapper[4962]: I0220 10:11:19.674024 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.674300 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.674418 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:27.674388839 +0000 UTC m=+979.256860685 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: I0220 10:11:20.080706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:20 crc kubenswrapper[4962]: I0220 10:11:20.080812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081041 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081110 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:28.081087724 +0000 UTC m=+979.663559570 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081421 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081547 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:28.081513996 +0000 UTC m=+979.663985842 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:26 crc kubenswrapper[4962]: E0220 10:11:26.603354 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 20 10:11:26 crc kubenswrapper[4962]: E0220 10:11:26.604318 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c5t22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-d2clq_openstack-operators(4e2614ed-ea7a-430e-af7b-4d66f05f7b96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:11:26 crc kubenswrapper[4962]: E0220 10:11:26.605654 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" podUID="4e2614ed-ea7a-430e-af7b-4d66f05f7b96" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.249203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" event={"ID":"e0560856-ed00-4ea8-8ce7-a801f1d46489","Type":"ContainerStarted","Data":"9587b4cc915bd223ced4ad688e359bc596394dd68782d13782774cb8edc40f9d"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.250080 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:27 crc 
kubenswrapper[4962]: I0220 10:11:27.252241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" event={"ID":"ac33f7ed-c3f8-487d-89dc-4a614d357b86","Type":"ContainerStarted","Data":"ab8ac08a352ba7c52306132a961dc005f5d97d4b5ad01d781149f0cc6c03268f"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.253095 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.255358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" event={"ID":"72728d52-a8e9-4689-8da0-871f250f7664","Type":"ContainerStarted","Data":"e12e05b18241c328b34665be7687a1cff988c8b8df28f1f75e8e73dc0a32bda6"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.255872 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.259444 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" event={"ID":"6fdeab3e-de35-4d69-9e67-e5d8257bc25d","Type":"ContainerStarted","Data":"d0379e1295a0d61c9b37a457180f1a8d19d6c9ccca5b1538c1481616858f6822"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.259960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.261577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" event={"ID":"cf0e10ba-c175-44c3-9011-6646f21ba334","Type":"ContainerStarted","Data":"df4608f5e8ec2e972b257e396da911f7ff0e311d7a83103d4787622777e147ef"} 
Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.262056 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.263952 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" event={"ID":"f8f1dca9-8b83-469d-b834-3f11376576c9","Type":"ContainerStarted","Data":"de8d7708d5d0fa3791692499a6e7bd93588a0fd28671b624f22bb642a0161890"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.264406 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.266411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" event={"ID":"34cb38e0-7c0a-4f00-89e9-9be7b394585d","Type":"ContainerStarted","Data":"0b6b01c300734b885eb40f42c6585da05685005a9c4200b81f1a5fa1b8247e48"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.267130 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.268873 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" event={"ID":"12f33757-f329-47a6-9273-bdeb1558a4d7","Type":"ContainerStarted","Data":"05c962eff274c8bae15b00fd2f211bcb968c23ce0185a4df9d19741cd32e5916"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.269289 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.270677 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" event={"ID":"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084","Type":"ContainerStarted","Data":"3d08eff0b78e216b6186846e28e49908982695e75945070513ae6a45211b316f"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.271086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.272400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" event={"ID":"7afb870a-75a4-42d5-9704-5cef14dd3ce9","Type":"ContainerStarted","Data":"715bd46432611a2a2686bdeb2c0fe1cbfa0ddd178c6a1f4c23686ad07b8ab806"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.272844 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.274259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" event={"ID":"a9979be5-6650-425b-a748-51e2cb552413","Type":"ContainerStarted","Data":"93ab62be7513adf352b5e3d097b0ce58852ed7f79e8e5275cc808a99e66e3142"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.274771 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.276277 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" event={"ID":"5fec06f1-8ccf-403c-88de-2b581f056802","Type":"ContainerStarted","Data":"00f242ee1435c337aa5cefd9c22a505457bb5e46910e32802ecbd403a383ad94"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.276741 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.278219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" event={"ID":"ea986843-26e4-4410-a65e-ae51c02dc04c","Type":"ContainerStarted","Data":"6f73ba4a8bd7cf78757487549e7c5547b7b7ba69a7c509f3697a9f9423047d0e"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.278681 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.280369 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" event={"ID":"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb","Type":"ContainerStarted","Data":"701ff4fddbcab9a3e85b0ffcc55f73ae223df49b5e1484e6866fba8f1a7a60d2"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.280455 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.282122 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" podUID="4e2614ed-ea7a-430e-af7b-4d66f05f7b96" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.301841 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" podStartSLOduration=2.353980298 
podStartE2EDuration="16.301818705s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:12.078943103 +0000 UTC m=+963.661414949" lastFinishedPulling="2026-02-20 10:11:26.02678151 +0000 UTC m=+977.609253356" observedRunningTime="2026-02-20 10:11:27.294835578 +0000 UTC m=+978.877307424" watchObservedRunningTime="2026-02-20 10:11:27.301818705 +0000 UTC m=+978.884290551" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.359301 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" podStartSLOduration=3.823041464 podStartE2EDuration="16.359281055s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.060870589 +0000 UTC m=+965.643342435" lastFinishedPulling="2026-02-20 10:11:26.59711018 +0000 UTC m=+978.179582026" observedRunningTime="2026-02-20 10:11:27.325326297 +0000 UTC m=+978.907798143" watchObservedRunningTime="2026-02-20 10:11:27.359281055 +0000 UTC m=+978.941752901" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.417121 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.418796 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.418859 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. 
No retries permitted until 2026-02-20 10:11:43.41884288 +0000 UTC m=+995.001314726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.433163 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" podStartSLOduration=3.323452742 podStartE2EDuration="16.433145704s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.457512997 +0000 UTC m=+965.039984843" lastFinishedPulling="2026-02-20 10:11:26.567205959 +0000 UTC m=+978.149677805" observedRunningTime="2026-02-20 10:11:27.429801641 +0000 UTC m=+979.012273487" watchObservedRunningTime="2026-02-20 10:11:27.433145704 +0000 UTC m=+979.015617550" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.433293 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" podStartSLOduration=3.502887066 podStartE2EDuration="16.433288749s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.096229422 +0000 UTC m=+964.678701268" lastFinishedPulling="2026-02-20 10:11:26.026631105 +0000 UTC m=+977.609102951" observedRunningTime="2026-02-20 10:11:27.374421706 +0000 UTC m=+978.956893562" watchObservedRunningTime="2026-02-20 10:11:27.433288749 +0000 UTC m=+979.015760595" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.571903 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" podStartSLOduration=3.422013097 podStartE2EDuration="16.571887165s" 
podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.465099419 +0000 UTC m=+965.047571265" lastFinishedPulling="2026-02-20 10:11:26.614973497 +0000 UTC m=+978.197445333" observedRunningTime="2026-02-20 10:11:27.5707768 +0000 UTC m=+979.153248646" watchObservedRunningTime="2026-02-20 10:11:27.571887165 +0000 UTC m=+979.154359011" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.574896 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" podStartSLOduration=3.997273064 podStartE2EDuration="16.574888269s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.051565864 +0000 UTC m=+965.634037710" lastFinishedPulling="2026-02-20 10:11:26.629181059 +0000 UTC m=+978.211652915" observedRunningTime="2026-02-20 10:11:27.505630322 +0000 UTC m=+979.088102168" watchObservedRunningTime="2026-02-20 10:11:27.574888269 +0000 UTC m=+979.157360115" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.613384 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" podStartSLOduration=4.057935608 podStartE2EDuration="16.613366507s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.471184665 +0000 UTC m=+965.053656511" lastFinishedPulling="2026-02-20 10:11:26.026615574 +0000 UTC m=+977.609087410" observedRunningTime="2026-02-20 10:11:27.609434364 +0000 UTC m=+979.191906210" watchObservedRunningTime="2026-02-20 10:11:27.613366507 +0000 UTC m=+979.195838353" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.649933 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" podStartSLOduration=3.892971619 podStartE2EDuration="16.649916335s" 
podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.861784407 +0000 UTC m=+965.444256253" lastFinishedPulling="2026-02-20 10:11:26.618729123 +0000 UTC m=+978.201200969" observedRunningTime="2026-02-20 10:11:27.647957864 +0000 UTC m=+979.230429710" watchObservedRunningTime="2026-02-20 10:11:27.649916335 +0000 UTC m=+979.232388181" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.685350 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" podStartSLOduration=3.807338585 podStartE2EDuration="16.685332528s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.689682331 +0000 UTC m=+965.272154177" lastFinishedPulling="2026-02-20 10:11:26.567676274 +0000 UTC m=+978.150148120" observedRunningTime="2026-02-20 10:11:27.673532761 +0000 UTC m=+979.256004607" watchObservedRunningTime="2026-02-20 10:11:27.685332528 +0000 UTC m=+979.267804374" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.720369 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.720853 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.720899 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. 
No retries permitted until 2026-02-20 10:11:43.720886706 +0000 UTC m=+995.303358552 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.772317 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" podStartSLOduration=4.0127478 podStartE2EDuration="16.772299066s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.859118935 +0000 UTC m=+965.441590781" lastFinishedPulling="2026-02-20 10:11:26.618670201 +0000 UTC m=+978.201142047" observedRunningTime="2026-02-20 10:11:27.731920849 +0000 UTC m=+979.314392695" watchObservedRunningTime="2026-02-20 10:11:27.772299066 +0000 UTC m=+979.354770912" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.774040 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" podStartSLOduration=4.227947407 podStartE2EDuration="16.774032481s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.481119169 +0000 UTC m=+965.063591015" lastFinishedPulling="2026-02-20 10:11:26.027204243 +0000 UTC m=+977.609676089" observedRunningTime="2026-02-20 10:11:27.769166798 +0000 UTC m=+979.351638644" watchObservedRunningTime="2026-02-20 10:11:27.774032481 +0000 UTC m=+979.356504327" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.853116 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" podStartSLOduration=3.932087269 
podStartE2EDuration="16.853099392s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.671469183 +0000 UTC m=+965.253941029" lastFinishedPulling="2026-02-20 10:11:26.592481306 +0000 UTC m=+978.174953152" observedRunningTime="2026-02-20 10:11:27.852326639 +0000 UTC m=+979.434798485" watchObservedRunningTime="2026-02-20 10:11:27.853099392 +0000 UTC m=+979.435571238" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.909850 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" podStartSLOduration=3.776749261 podStartE2EDuration="16.909834029s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.434258375 +0000 UTC m=+965.016730221" lastFinishedPulling="2026-02-20 10:11:26.567343133 +0000 UTC m=+978.149814989" observedRunningTime="2026-02-20 10:11:27.891483418 +0000 UTC m=+979.473955264" watchObservedRunningTime="2026-02-20 10:11:27.909834029 +0000 UTC m=+979.492305875" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.912039 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" podStartSLOduration=4.188378109 podStartE2EDuration="16.912032567s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.859074614 +0000 UTC m=+965.441546460" lastFinishedPulling="2026-02-20 10:11:26.582729082 +0000 UTC m=+978.165200918" observedRunningTime="2026-02-20 10:11:27.908778317 +0000 UTC m=+979.491250163" watchObservedRunningTime="2026-02-20 10:11:27.912032567 +0000 UTC m=+979.494504413" Feb 20 10:11:28 crc kubenswrapper[4962]: I0220 10:11:28.131612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod 
\"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:28 crc kubenswrapper[4962]: I0220 10:11:28.131734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131839 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131875 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131922 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:44.131909685 +0000 UTC m=+995.714381531 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131939 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. 
No retries permitted until 2026-02-20 10:11:44.131931126 +0000 UTC m=+995.714402962 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.321435 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" event={"ID":"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1","Type":"ContainerStarted","Data":"323f98ce1601de8894336c7bd61186e8d026d66d2bcdc26540de7ad29dbebbf7"} Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.329739 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.351354 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podStartSLOduration=4.242454111 podStartE2EDuration="20.35132573s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.085383539 +0000 UTC m=+965.667855385" lastFinishedPulling="2026-02-20 10:11:30.194255168 +0000 UTC m=+981.776727004" observedRunningTime="2026-02-20 10:11:31.348571304 +0000 UTC m=+982.931043150" watchObservedRunningTime="2026-02-20 10:11:31.35132573 +0000 UTC m=+982.933797586" Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.735292 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.764181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.077730 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.077820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.143198 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.200485 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.324548 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.346836 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.398043 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" event={"ID":"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16","Type":"ContainerStarted","Data":"06733ac5121ca8b1af3011160578bd66b9ee3b24ddfcd017df1578d5ef88dfb5"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.399397 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.400223 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" event={"ID":"14efe385-5147-49ed-a42f-804b91438a55","Type":"ContainerStarted","Data":"7265950e2e793defd3421e8394e26811a3289118f85d0010c7dbb0bf0eb1e2c9"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.400482 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.402287 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" event={"ID":"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a","Type":"ContainerStarted","Data":"8081ce6b5031a3b98b6c204d892981565ae720e05862b06dca6f686aa3e02ddb"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.404474 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" event={"ID":"4a325f02-ddda-49e9-9ef0-40fd4726b09f","Type":"ContainerStarted","Data":"193e901d7eb4559e021415a15936acae9c3fff27b7cd8d4e6cce3b63f9a46a95"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.404759 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.425288 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podStartSLOduration=3.851730459 podStartE2EDuration="26.425268137s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.075520577 +0000 UTC m=+965.657992423" lastFinishedPulling="2026-02-20 10:11:36.649058235 +0000 UTC m=+988.231530101" observedRunningTime="2026-02-20 10:11:37.419449516 +0000 UTC m=+989.001921362" watchObservedRunningTime="2026-02-20 
10:11:37.425268137 +0000 UTC m=+989.007739983" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.438946 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podStartSLOduration=3.834117148 podStartE2EDuration="26.438925972s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.06219551 +0000 UTC m=+965.644667356" lastFinishedPulling="2026-02-20 10:11:36.667004334 +0000 UTC m=+988.249476180" observedRunningTime="2026-02-20 10:11:37.437642403 +0000 UTC m=+989.020114249" watchObservedRunningTime="2026-02-20 10:11:37.438925972 +0000 UTC m=+989.021397818" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.468078 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podStartSLOduration=2.917444821 podStartE2EDuration="25.468049719s" podCreationTimestamp="2026-02-20 10:11:12 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.065904643 +0000 UTC m=+965.648376479" lastFinishedPulling="2026-02-20 10:11:36.616509531 +0000 UTC m=+988.198981377" observedRunningTime="2026-02-20 10:11:37.459502263 +0000 UTC m=+989.041974129" watchObservedRunningTime="2026-02-20 10:11:37.468049719 +0000 UTC m=+989.050521565" Feb 20 10:11:38 crc kubenswrapper[4962]: I0220 10:11:38.175250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podStartSLOduration=4.625262262 podStartE2EDuration="27.175210371s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.065536581 +0000 UTC m=+965.648008417" lastFinishedPulling="2026-02-20 10:11:36.61548464 +0000 UTC m=+988.197956526" observedRunningTime="2026-02-20 10:11:37.485335598 +0000 UTC m=+989.067807464" watchObservedRunningTime="2026-02-20 10:11:38.175210371 +0000 UTC 
m=+989.757682257" Feb 20 10:11:39 crc kubenswrapper[4962]: I0220 10:11:39.426023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" event={"ID":"4e2614ed-ea7a-430e-af7b-4d66f05f7b96","Type":"ContainerStarted","Data":"06a7b62b8e10b7a63bc55092190701acc90fd2479affaadec1a66e80ed258450"} Feb 20 10:11:39 crc kubenswrapper[4962]: I0220 10:11:39.426605 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:39 crc kubenswrapper[4962]: I0220 10:11:39.459054 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" podStartSLOduration=3.352429047 podStartE2EDuration="28.459038021s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.474681972 +0000 UTC m=+965.057153818" lastFinishedPulling="2026-02-20 10:11:38.581290936 +0000 UTC m=+990.163762792" observedRunningTime="2026-02-20 10:11:39.455981245 +0000 UTC m=+991.038453091" watchObservedRunningTime="2026-02-20 10:11:39.459038021 +0000 UTC m=+991.041509867" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.640040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.713279 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.715673 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.850018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.888515 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.982052 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.149243 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.323019 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.383221 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.410142 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.514443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.522698 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.660695 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.827290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.836083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.929869 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"] Feb 20 10:11:43 crc kubenswrapper[4962]: W0220 10:11:43.935347 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8c62e9_0201_43a4_b823_82af87a0977e.slice/crio-d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9 WatchSource:0}: Error finding container d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9: Status 404 returned error can't 
find the container with id d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9 Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.030923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.133982 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.136992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.143066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.144148 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: 
\"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.174968 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.358901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf"] Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.462042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" event={"ID":"0c8c62e9-0201-43a4-b823-82af87a0977e","Type":"ContainerStarted","Data":"d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9"} Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.463909 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" event={"ID":"f8de466d-f069-4a8e-8598-72a163525c24","Type":"ContainerStarted","Data":"4dd45bf4f16770003ef342051144f96b6bd4f911216d98b705f544aee98ffb8a"} Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.686053 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz"] Feb 20 10:11:45 crc kubenswrapper[4962]: I0220 10:11:45.474833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" event={"ID":"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98","Type":"ContainerStarted","Data":"7de56f10d1f3bd38a08860549618e484c42477cb0396189ac3b1bcc230588146"} Feb 20 10:11:49 crc kubenswrapper[4962]: I0220 10:11:49.517276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" event={"ID":"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98","Type":"ContainerStarted","Data":"92f02ef4ba51657d3253e259e7fc434f227c78da0dc3958b52516ed83a52be41"} Feb 20 10:11:49 crc kubenswrapper[4962]: I0220 10:11:49.518264 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:49 crc kubenswrapper[4962]: I0220 10:11:49.582140 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" podStartSLOduration=37.582117032 podStartE2EDuration="37.582117032s" podCreationTimestamp="2026-02-20 10:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:11:49.569368174 +0000 UTC m=+1001.151840060" watchObservedRunningTime="2026-02-20 10:11:49.582117032 +0000 UTC m=+1001.164588878" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.170841 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.546274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" event={"ID":"f8de466d-f069-4a8e-8598-72a163525c24","Type":"ContainerStarted","Data":"d63620977efe9a7117d0118bf8371fbbec4315da5ba0d7b8d1c93e5502dc7733"} Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.546394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.547939 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" event={"ID":"0c8c62e9-0201-43a4-b823-82af87a0977e","Type":"ContainerStarted","Data":"652ad8c6326d73ac5d1f5c1080a1e90ea6fa94709991fe368150108388a3e850"} Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.548089 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.608928 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" podStartSLOduration=34.436460643 podStartE2EDuration="41.608901999s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:44.376708001 +0000 UTC m=+995.959179847" lastFinishedPulling="2026-02-20 10:11:51.549149347 +0000 UTC m=+1003.131621203" observedRunningTime="2026-02-20 10:11:52.583876489 +0000 UTC m=+1004.166348345" watchObservedRunningTime="2026-02-20 10:11:52.608901999 +0000 UTC m=+1004.191373845" Feb 20 10:11:54 crc kubenswrapper[4962]: I0220 10:11:54.184801 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:54 crc kubenswrapper[4962]: I0220 10:11:54.229113 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" podStartSLOduration=35.645013658 podStartE2EDuration="43.229095772s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:43.93833514 +0000 UTC m=+995.520806976" lastFinishedPulling="2026-02-20 10:11:51.522417234 +0000 UTC m=+1003.104889090" observedRunningTime="2026-02-20 10:11:52.605064469 +0000 UTC m=+1004.187536315" watchObservedRunningTime="2026-02-20 10:11:54.229095772 +0000 UTC m=+1005.811567618" Feb 20 
10:12:03 crc kubenswrapper[4962]: I0220 10:12:03.668519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:12:04 crc kubenswrapper[4962]: I0220 10:12:04.041068 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.958050 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.960335 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965426 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965544 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965725 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965954 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pbqd6" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.968024 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.025475 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.026923 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.031098 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.042934 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.112886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.112948 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.113056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.113227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " 
pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.113330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 
10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.216732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.217323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.217502 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.237581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.247130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.283506 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.353309 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.590087 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.604684 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.870453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" event={"ID":"35f03c4f-de3b-4981-9e78-b8d1a1d171b5","Type":"ContainerStarted","Data":"153e2efb6d99e57bdd8c71d555149537a37f6ac8ec26c3492f416c36ef39e106"} Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.889005 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:21 crc kubenswrapper[4962]: W0220 10:12:21.903990 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd99be0c7_0310_4fa4_9426_63be765a9e85.slice/crio-c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc WatchSource:0}: Error finding container c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc: Status 404 returned error can't find the container with id c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc Feb 20 10:12:22 crc 
kubenswrapper[4962]: I0220 10:12:22.883227 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" event={"ID":"d99be0c7-0310-4fa4-9426-63be765a9e85","Type":"ContainerStarted","Data":"c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc"} Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.038557 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.070481 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.072735 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.098842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.155906 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.155977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.156009 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.263312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.263370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.263409 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.264613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.266221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: 
\"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.289818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.394187 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.831129 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.852838 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.855526 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.878452 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.878526 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.878561 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.882047 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.894675 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.980464 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc 
kubenswrapper[4962]: I0220 10:12:23.980581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.980630 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.981897 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.984734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.009368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.192468 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.223871 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.225111 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.229692 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.230015 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.231931 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.231973 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d7jzr" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.232018 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.232298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.232430 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.239666 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285544 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285612 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285647 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285675 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " 
pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285757 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285779 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc 
kubenswrapper[4962]: I0220 10:12:24.388062 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388694 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388719 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388754 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.389013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.389047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " 
pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.389957 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.390271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.390641 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.390993 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.396172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.396260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.398775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.399652 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.399736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.400652 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.412051 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " 
pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.466836 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.572188 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.754304 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:12:24 crc kubenswrapper[4962]: W0220 10:12:24.783175 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb061854a_f0c6_4754_a947_a7d5408f25db.slice/crio-2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf WatchSource:0}: Error finding container 2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf: Status 404 returned error can't find the container with id 2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.934820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" event={"ID":"b061854a-f0c6-4754-a947-a7d5408f25db","Type":"ContainerStarted","Data":"2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf"} Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.937715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" event={"ID":"01d0cdce-fd47-471a-94af-ee68fed6a2aa","Type":"ContainerStarted","Data":"6ee62349849ee2a01e9e7674d3fdcbef155f78a8a88598da3702e8fea9005811"} Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.977730 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.980369 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.983710 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.983873 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.983947 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.984907 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.985690 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.987445 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tbhds" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.987706 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.998997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 
10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001473 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001506 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" 
Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001525 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003778 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc 
kubenswrapper[4962]: I0220 10:12:25.085169 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:12:25 crc kubenswrapper[4962]: W0220 10:12:25.085725 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8d652d_aea8_4a83_b33e_0d2522af0be8.slice/crio-b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c WatchSource:0}: Error finding container b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c: Status 404 returned error can't find the container with id b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106603 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106669 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.109264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.109285 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.110544 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" 
Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.110775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.110887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.113190 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.115833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.119292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.122749 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.122876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.130438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.149496 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.320156 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.935016 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:12:25 crc kubenswrapper[4962]: W0220 10:12:25.966047 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56a77dd3_ef10_46a6_a00d_ab38af0d4338.slice/crio-1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5 WatchSource:0}: Error finding container 1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5: Status 404 returned error can't find the container with id 1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5 Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.972794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerStarted","Data":"b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c"} Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.396745 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.398367 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.405806 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.405926 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-898r2" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.407036 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.408027 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.410383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.411760 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537447 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537569 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537619 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537641 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537669 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537692 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zkm4\" (UniqueName: 
\"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638759 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638793 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638894 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638929 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.642253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.642956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.643435 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.643450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.643758 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.671802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.685692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.706928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zkm4\" (UniqueName: 
\"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.712821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.729909 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.007199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerStarted","Data":"1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5"} Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.477962 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.746888 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.749838 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.754298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.756356 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.757156 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4svgj" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.759193 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.759296 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868197 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868262 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868467 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.870560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.870625 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.972970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") 
pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973177 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.974560 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.975028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " 
pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.975045 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.975661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.976361 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.983776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.991424 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 
10:12:27.993837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.026953 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.075321 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.083689 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.088046 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.090842 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.091141 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-njzvl" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.091512 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.094285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282348 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282527 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.283387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.283923 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.300669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.303654 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.307709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.416053 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.245897 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.247176 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.253092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5f72b" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.256143 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.336498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"kube-state-metrics-0\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.438811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"kube-state-metrics-0\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.461654 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqfr2\" (UniqueName: 
\"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"kube-state-metrics-0\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.575300 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.801760 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.803302 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.805906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5c7mx" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.806076 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.806768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.816680 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.847905 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.847975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848006 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848208 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.920876 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.926540 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.944136 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949839 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949909 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949931 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949953 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.950002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952044 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"ovn-controller-wj9f6\" (UID: 
\"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.960162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.962035 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.964898 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.965645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966458 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gw5b6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966608 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966751 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.968848 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.973219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.986455 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051627 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051752 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051797 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051820 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051841 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.083233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerStarted","Data":"9629cf6fabd95f146380c31c7bc910c7de73918acc62bb7e7fbe72c4774cfa18"} Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154082 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154682 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" 
Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154769 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154883 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154977 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.155020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.155037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.155491 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.156035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " 
pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.157117 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.157236 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.157329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.160230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.160635 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.162859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.168702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.168981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.171187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.171277 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.171709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.176228 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.180651 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.183928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.255982 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.323548 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.926919 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.930492 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.936315 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.936454 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-58l5v" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.937288 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.946140 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.961214 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.130774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.130887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 
crc kubenswrapper[4962]: I0220 10:12:38.131286 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.131733 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132202 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132377 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234793 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234948 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.235012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.235515 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.235907 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: 
I0220 10:12:38.238469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.239817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.249740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.250061 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.254403 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.257207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrg9\" (UniqueName: 
\"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.268995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.277298 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:41 crc kubenswrapper[4962]: I0220 10:12:41.507843 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:12:41 crc kubenswrapper[4962]: I0220 10:12:41.508138 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.900721 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.900721 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.901370 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hckp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(56a77dd3-ef10-46a6-a00d-ab38af0d4338): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:46 crc 
kubenswrapper[4962]: E0220 10:12:46.901508 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rcvhk
,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(2a8d652d-aea8-4a83-b33e-0d2522af0be8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.903916 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.903921 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.216548 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.216646 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.843189 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.843471 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4m85l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-jg2pc_openstack(35f03c4f-de3b-4981-9e78-b8d1a1d171b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.844769 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" podUID="35f03c4f-de3b-4981-9e78-b8d1a1d171b5" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.975458 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.976468 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qd6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-stkxf_openstack(d99be0c7-0310-4fa4-9426-63be765a9e85): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.978038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" podUID="d99be0c7-0310-4fa4-9426-63be765a9e85" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.999367 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:49.999819 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prlr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-svjlj_openstack(b061854a-f0c6-4754-a947-a7d5408f25db): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.003798 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" podUID="b061854a-f0c6-4754-a947-a7d5408f25db" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.052433 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.052699 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dbrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-2drvh_openstack(01d0cdce-fd47-471a-94af-ee68fed6a2aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.053994 4962 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" podUID="01d0cdce-fd47-471a-94af-ee68fed6a2aa" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.148051 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.241180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" event={"ID":"35f03c4f-de3b-4981-9e78-b8d1a1d171b5","Type":"ContainerDied","Data":"153e2efb6d99e57bdd8c71d555149537a37f6ac8ec26c3492f416c36ef39e106"} Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.241248 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.242281 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" podUID="01d0cdce-fd47-471a-94af-ee68fed6a2aa" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.242764 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" podUID="b061854a-f0c6-4754-a947-a7d5408f25db" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.311135 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.311656 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.313126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config" (OuterVolumeSpecName: "config") pod "35f03c4f-de3b-4981-9e78-b8d1a1d171b5" (UID: "35f03c4f-de3b-4981-9e78-b8d1a1d171b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.322257 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l" (OuterVolumeSpecName: "kube-api-access-4m85l") pod "35f03c4f-de3b-4981-9e78-b8d1a1d171b5" (UID: "35f03c4f-de3b-4981-9e78-b8d1a1d171b5"). InnerVolumeSpecName "kube-api-access-4m85l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.413310 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.413345 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.453429 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.588832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.608624 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.628825 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.635425 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.748320 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:12:50 crc kubenswrapper[4962]: W0220 10:12:50.773796 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7df7b95_a5ed_4e4e_81f0_9f718bab0bcc.slice/crio-13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d WatchSource:0}: Error finding container 13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d: Status 404 returned error can't find the 
container with id 13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.803124 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.923869 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"d99be0c7-0310-4fa4-9426-63be765a9e85\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924410 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"d99be0c7-0310-4fa4-9426-63be765a9e85\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"d99be0c7-0310-4fa4-9426-63be765a9e85\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924473 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d99be0c7-0310-4fa4-9426-63be765a9e85" (UID: "d99be0c7-0310-4fa4-9426-63be765a9e85"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924893 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config" (OuterVolumeSpecName: "config") pod "d99be0c7-0310-4fa4-9426-63be765a9e85" (UID: "d99be0c7-0310-4fa4-9426-63be765a9e85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924942 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.739400 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.756055 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f03c4f-de3b-4981-9e78-b8d1a1d171b5" path="/var/lib/kubelet/pods/35f03c4f-de3b-4981-9e78-b8d1a1d171b5/volumes" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.778660 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h" (OuterVolumeSpecName: "kube-api-access-4qd6h") pod "d99be0c7-0310-4fa4-9426-63be765a9e85" (UID: "d99be0c7-0310-4fa4-9426-63be765a9e85"). InnerVolumeSpecName "kube-api-access-4qd6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.786703 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.823962 4962 kubelet_pods.go:2476] "Failed to reduce cpu time for pod pending volume cleanup" podUID="d99be0c7-0310-4fa4-9426-63be765a9e85" err="openat2 /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd99be0c7_0310_4fa4_9426_63be765a9e85.slice/cgroup.controllers: no such file or directory" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerStarted","Data":"f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerStarted","Data":"d0a92b505f163c98c2579b38133407e2587dcd82e4a7d6302d1e3ca2e2112d68"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824086 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" event={"ID":"d99be0c7-0310-4fa4-9426-63be765a9e85","Type":"ContainerDied","Data":"c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824102 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerStarted","Data":"1ed5bd754fe42b78759f03224b6a39f1b92d8d484574e9a6557ab622debe2a23"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824115 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.844348 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qd6h\" 
(UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.868678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerStarted","Data":"5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.874662 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerStarted","Data":"527bc0b9350edbbd23edfe05a933e12b44f8d4ad0c70495feffaffb9052c4070"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.875772 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerStarted","Data":"13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.919672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.993503 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.007335 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.810188 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.887325 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerStarted","Data":"a2d2a8a63bf5c9ebd610b16b09ca46a05d03ae717f57b9ce876334d685870041"} Feb 20 
10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.889954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerStarted","Data":"2226a3425cb913ac33dc3114a16db2100facfc7423dff93548d53775b718e6e2"} Feb 20 10:12:53 crc kubenswrapper[4962]: I0220 10:12:53.152872 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d99be0c7-0310-4fa4-9426-63be765a9e85" path="/var/lib/kubelet/pods/d99be0c7-0310-4fa4-9426-63be765a9e85/volumes" Feb 20 10:12:53 crc kubenswrapper[4962]: I0220 10:12:53.902835 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerStarted","Data":"b4d03ac8272f687d64246b8c3c40efcac57552a3657ef2ee1db4c3625f47035c"} Feb 20 10:12:54 crc kubenswrapper[4962]: I0220 10:12:54.914308 4962 generic.go:334] "Generic (PLEG): container finished" podID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerID="5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72" exitCode=0 Feb 20 10:12:54 crc kubenswrapper[4962]: I0220 10:12:54.914398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerDied","Data":"5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72"} Feb 20 10:12:55 crc kubenswrapper[4962]: I0220 10:12:55.924817 4962 generic.go:334] "Generic (PLEG): container finished" podID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerID="f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2" exitCode=0 Feb 20 10:12:55 crc kubenswrapper[4962]: I0220 10:12:55.925059 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerDied","Data":"f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2"} Feb 20 10:12:57 crc 
kubenswrapper[4962]: I0220 10:12:57.949712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerStarted","Data":"b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.952575 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerID="b6626b3616a8427737e8c790adcc57ad3f4d0385df8b472ffc49fd4bd021b003" exitCode=0 Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.952724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"b6626b3616a8427737e8c790adcc57ad3f4d0385df8b472ffc49fd4bd021b003"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.956826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerStarted","Data":"a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.958955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerStarted","Data":"d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.959472 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.963152 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerStarted","Data":"9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.968167 
4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerStarted","Data":"0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.981315 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerStarted","Data":"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.981417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.984433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerStarted","Data":"fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.984584 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.036888 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=16.679746766 podStartE2EDuration="33.036864101s" podCreationTimestamp="2026-02-20 10:12:25 +0000 UTC" firstStartedPulling="2026-02-20 10:12:33.695694656 +0000 UTC m=+1045.278166502" lastFinishedPulling="2026-02-20 10:12:50.052811981 +0000 UTC m=+1061.635283837" observedRunningTime="2026-02-20 10:12:58.034323042 +0000 UTC m=+1069.616794888" watchObservedRunningTime="2026-02-20 10:12:58.036864101 +0000 UTC m=+1069.619335947" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.038024 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" 
podStartSLOduration=32.038017247 podStartE2EDuration="32.038017247s" podCreationTimestamp="2026-02-20 10:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:12:58.003311786 +0000 UTC m=+1069.585783672" watchObservedRunningTime="2026-02-20 10:12:58.038017247 +0000 UTC m=+1069.620489093" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.063979 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wj9f6" podStartSLOduration=19.280038419 podStartE2EDuration="25.063944944s" podCreationTimestamp="2026-02-20 10:12:33 +0000 UTC" firstStartedPulling="2026-02-20 10:12:50.592826298 +0000 UTC m=+1062.175298164" lastFinishedPulling="2026-02-20 10:12:56.376732843 +0000 UTC m=+1067.959204689" observedRunningTime="2026-02-20 10:12:58.059106974 +0000 UTC m=+1069.641578830" watchObservedRunningTime="2026-02-20 10:12:58.063944944 +0000 UTC m=+1069.646416790" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.077309 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.077350 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.083755 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=21.721689741 podStartE2EDuration="28.083741521s" podCreationTimestamp="2026-02-20 10:12:30 +0000 UTC" firstStartedPulling="2026-02-20 10:12:50.780056258 +0000 UTC m=+1062.362528104" lastFinishedPulling="2026-02-20 10:12:57.142108008 +0000 UTC m=+1068.724579884" observedRunningTime="2026-02-20 10:12:58.081058587 +0000 UTC m=+1069.663530433" watchObservedRunningTime="2026-02-20 10:12:58.083741521 +0000 UTC m=+1069.666213367" Feb 20 10:12:58 
crc kubenswrapper[4962]: I0220 10:12:58.111272 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.497626665 podStartE2EDuration="30.111238517s" podCreationTimestamp="2026-02-20 10:12:28 +0000 UTC" firstStartedPulling="2026-02-20 10:12:50.470239901 +0000 UTC m=+1062.052711747" lastFinishedPulling="2026-02-20 10:12:56.083851743 +0000 UTC m=+1067.666323599" observedRunningTime="2026-02-20 10:12:58.101938228 +0000 UTC m=+1069.684410094" watchObservedRunningTime="2026-02-20 10:12:58.111238517 +0000 UTC m=+1069.693710403" Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.004524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerStarted","Data":"fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38"} Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.005686 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.006117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerStarted","Data":"0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2"} Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.038198 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-r7g9h" podStartSLOduration=22.673265496 podStartE2EDuration="26.038169832s" podCreationTimestamp="2026-02-20 10:12:33 +0000 UTC" firstStartedPulling="2026-02-20 10:12:53.652646463 +0000 UTC m=+1065.235118309" lastFinishedPulling="2026-02-20 10:12:57.017550799 +0000 UTC m=+1068.600022645" observedRunningTime="2026-02-20 10:12:59.031587518 +0000 UTC m=+1070.614059374" watchObservedRunningTime="2026-02-20 10:12:59.038169832 +0000 UTC 
m=+1070.620641688" Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.256844 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.018833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerStarted","Data":"e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed"} Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.022876 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerStarted","Data":"a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8"} Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.095102 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.579625857 podStartE2EDuration="24.095065905s" podCreationTimestamp="2026-02-20 10:12:36 +0000 UTC" firstStartedPulling="2026-02-20 10:12:51.868226024 +0000 UTC m=+1063.450697870" lastFinishedPulling="2026-02-20 10:12:59.383666062 +0000 UTC m=+1070.966137918" observedRunningTime="2026-02-20 10:13:00.086304042 +0000 UTC m=+1071.668775918" watchObservedRunningTime="2026-02-20 10:13:00.095065905 +0000 UTC m=+1071.677537791" Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.100891 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.68500445 podStartE2EDuration="28.100839345s" podCreationTimestamp="2026-02-20 10:12:32 +0000 UTC" firstStartedPulling="2026-02-20 10:12:51.961241891 +0000 UTC m=+1063.543713737" lastFinishedPulling="2026-02-20 10:12:59.377076786 +0000 UTC m=+1070.959548632" observedRunningTime="2026-02-20 10:13:00.049134665 +0000 UTC m=+1071.631606511" watchObservedRunningTime="2026-02-20 10:13:00.100839345 
+0000 UTC m=+1071.683311221" Feb 20 10:13:01 crc kubenswrapper[4962]: I0220 10:13:01.324105 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:01 crc kubenswrapper[4962]: I0220 10:13:01.372380 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.049535 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerStarted","Data":"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e"} Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.053576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerStarted","Data":"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857"} Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.054124 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.122308 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.277652 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.379259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.453171 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.474677 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 
10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.476206 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.478425 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.499691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.532685 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.534749 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.545021 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.567056 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628098 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc 
kubenswrapper[4962]: I0220 10:13:02.628211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628327 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628374 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628418 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " 
pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628456 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628483 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730717 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " 
pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730766 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 
10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731015 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731915 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.732259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.738223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.739930 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.740206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.741581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.749991 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.750717 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.829931 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.844678 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.858719 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.939819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.939947 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.939982 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.940937 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01d0cdce-fd47-471a-94af-ee68fed6a2aa" (UID: "01d0cdce-fd47-471a-94af-ee68fed6a2aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.941350 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config" (OuterVolumeSpecName: "config") pod "01d0cdce-fd47-471a-94af-ee68fed6a2aa" (UID: "01d0cdce-fd47-471a-94af-ee68fed6a2aa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.953177 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr" (OuterVolumeSpecName: "kube-api-access-5dbrr") pod "01d0cdce-fd47-471a-94af-ee68fed6a2aa" (UID: "01d0cdce-fd47-471a-94af-ee68fed6a2aa"). InnerVolumeSpecName "kube-api-access-5dbrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.962967 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.996700 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.006034 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.009619 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.049345 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.052051 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.052097 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.052113 4962 reconciler_common.go:293] "Volume detached 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.101074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" event={"ID":"01d0cdce-fd47-471a-94af-ee68fed6a2aa","Type":"ContainerDied","Data":"6ee62349849ee2a01e9e7674d3fdcbef155f78a8a88598da3702e8fea9005811"} Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.101171 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.102001 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153421 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153455 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " 
pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153532 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.158822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.212769 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.219706 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.254873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.254968 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.255076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.255131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.255181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.258326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.258876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.259123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.259724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.280937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.396782 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.414300 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.421544 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.465682 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.488382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.497469 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.504742 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.505458 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.505584 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-knfns" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.505711 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.527397 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:13:03 crc kubenswrapper[4962]: W0220 10:13:03.537930 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c21489_524e_4ee7_a340_5be2573af161.slice/crio-f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57 WatchSource:0}: Error finding container f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57: Status 404 returned error can't find the 
container with id f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57 Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.557346 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"b061854a-f0c6-4754-a947-a7d5408f25db\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567528 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"b061854a-f0c6-4754-a947-a7d5408f25db\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"b061854a-f0c6-4754-a947-a7d5408f25db\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567960 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568045 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568107 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568157 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568236 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc 
kubenswrapper[4962]: I0220 10:13:03.568288 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config" (OuterVolumeSpecName: "config") pod "b061854a-f0c6-4754-a947-a7d5408f25db" (UID: "b061854a-f0c6-4754-a947-a7d5408f25db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.572917 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9" (OuterVolumeSpecName: "kube-api-access-prlr9") pod "b061854a-f0c6-4754-a947-a7d5408f25db" (UID: "b061854a-f0c6-4754-a947-a7d5408f25db"). InnerVolumeSpecName "kube-api-access-prlr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.573176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b061854a-f0c6-4754-a947-a7d5408f25db" (UID: "b061854a-f0c6-4754-a947-a7d5408f25db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670564 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670795 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670877 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670893 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") on node \"crc\" DevicePath \"\""
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670905 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") on node \"crc\" DevicePath \"\""
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.673078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.673549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.676500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.686613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.686984 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.690057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.693095 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.833005 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.959435 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"]
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.105434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerStarted","Data":"5a4113e8006a84520a74694b80780b48a9159ec2ba04b9aa6174205d45e900e7"}
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.107393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerStarted","Data":"bdc0784e8ac6a8e38cc361b433d0c6167f165dee537a2968d10b45106c2fa62c"}
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.108707 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" event={"ID":"b061854a-f0c6-4754-a947-a7d5408f25db","Type":"ContainerDied","Data":"2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf"}
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.108760 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj"
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.110957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerStarted","Data":"c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3"}
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.110982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerStarted","Data":"f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57"}
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.131629 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-k7csj" podStartSLOduration=2.131563135 podStartE2EDuration="2.131563135s" podCreationTimestamp="2026-02-20 10:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:04.130498601 +0000 UTC m=+1075.712970447" watchObservedRunningTime="2026-02-20 10:13:04.131563135 +0000 UTC m=+1075.714034981"
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.183946 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"]
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.189746 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"]
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.335832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 10:13:04 crc kubenswrapper[4962]: W0220 10:13:04.349613 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33d73a04_08b2_4944_861f_749a63c2565d.slice/crio-5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b WatchSource:0}: Error finding container 5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b: Status 404 returned error can't find the container with id 5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.738271 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.874877 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.125561 4962 generic.go:334] "Generic (PLEG): container finished" podID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerID="8b1c406d73d48e9a02cd34de0f3b729ea2c65e81565abd3265c085cac257a091" exitCode=0
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.125747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerDied","Data":"8b1c406d73d48e9a02cd34de0f3b729ea2c65e81565abd3265c085cac257a091"}
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.128935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerStarted","Data":"5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b"}
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.131608 4962 generic.go:334] "Generic (PLEG): container finished" podID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" exitCode=0
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.131681 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerDied","Data":"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb"}
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.175394 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d0cdce-fd47-471a-94af-ee68fed6a2aa" path="/var/lib/kubelet/pods/01d0cdce-fd47-471a-94af-ee68fed6a2aa/volumes"
Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.176539 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b061854a-f0c6-4754-a947-a7d5408f25db" path="/var/lib/kubelet/pods/b061854a-f0c6-4754-a947-a7d5408f25db/volumes"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.142776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerStarted","Data":"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453"}
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.143178 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerStarted","Data":"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe"}
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.143201 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.145347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerStarted","Data":"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80"}
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.145481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.147894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerStarted","Data":"060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd"}
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.148396 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.178814 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.97981958 podStartE2EDuration="3.178782077s" podCreationTimestamp="2026-02-20 10:13:03 +0000 UTC" firstStartedPulling="2026-02-20 10:13:04.355904721 +0000 UTC m=+1075.938376587" lastFinishedPulling="2026-02-20 10:13:05.554867228 +0000 UTC m=+1077.137339084" observedRunningTime="2026-02-20 10:13:06.174874025 +0000 UTC m=+1077.757345901" watchObservedRunningTime="2026-02-20 10:13:06.178782077 +0000 UTC m=+1077.761253933"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.204161 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" podStartSLOduration=3.247353962 podStartE2EDuration="4.204138097s" podCreationTimestamp="2026-02-20 10:13:02 +0000 UTC" firstStartedPulling="2026-02-20 10:13:03.476801295 +0000 UTC m=+1075.059273141" lastFinishedPulling="2026-02-20 10:13:04.43358541 +0000 UTC m=+1076.016057276" observedRunningTime="2026-02-20 10:13:06.202495275 +0000 UTC m=+1077.784967131" watchObservedRunningTime="2026-02-20 10:13:06.204138097 +0000 UTC m=+1077.786609953"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.228032 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" podStartSLOduration=3.6711040280000002 podStartE2EDuration="4.22800569s" podCreationTimestamp="2026-02-20 10:13:02 +0000 UTC" firstStartedPulling="2026-02-20 10:13:03.978682765 +0000 UTC m=+1075.561154611" lastFinishedPulling="2026-02-20 10:13:04.535584427 +0000 UTC m=+1076.118056273" observedRunningTime="2026-02-20 10:13:06.22255438 +0000 UTC m=+1077.805026236" watchObservedRunningTime="2026-02-20 10:13:06.22800569 +0000 UTC m=+1077.810477536"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.730317 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.730926 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.840234 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-48c8f"]
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.842352 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.847492 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.849210 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-48c8f"]
Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.903520 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.051710 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.051875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.155253 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.155540 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.158802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.196445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.338319 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.475539 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-48c8f"
Feb 20 10:13:08 crc kubenswrapper[4962]: I0220 10:13:08.048216 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-48c8f"]
Feb 20 10:13:08 crc kubenswrapper[4962]: I0220 10:13:08.184695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerStarted","Data":"f7f2e02b137d5913310205dbb052550410f12f1f612581277520789b3dc50ed0"}
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.433359 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-svsfg"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.435986 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.443517 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-svsfg"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.532822 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.534032 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.535860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.536049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.538397 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.544231 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638104 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638200 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638325 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.639393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.650334 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zfmzb"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.651935 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.661740 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zfmzb"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.665681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.746808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.756067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.761301 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.761848 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.765281 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.767102 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.770814 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.780100 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.784810 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"]
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.856966 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.858030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.858193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.960313 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.961331 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.965050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.965162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.973928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.989481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.067375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.067485 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.068368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.081689 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.099906 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.215058 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.237185 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-svsfg"]
Feb 20 10:13:10 crc kubenswrapper[4962]: W0220 10:13:10.257185 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c97128d_8360_482e_b05b_6025d046c122.slice/crio-5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36 WatchSource:0}: Error finding container 5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36: Status 404 returned error can't find the container with id 5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.383931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"]
Feb 20 10:13:10 crc kubenswrapper[4962]: W0220 10:13:10.406093 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d903f3_8f86_49e2_848b_4a59a9068b75.slice/crio-c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b WatchSource:0}: Error finding container c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b: Status 404 returned error can't find the container with id c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.523747 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"]
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.524162 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" containerID="cri-o://060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd" gracePeriod=10
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.528722 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.541896 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zfmzb"]
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.557083 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"]
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.559026 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.579779 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"]
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.607040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679714 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679899 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.725991 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"]
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781476 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781572 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.782719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.783260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.784477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.784805 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.830554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t"
Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.899839 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.219161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zfmzb" event={"ID":"2b915fcc-cf15-43c3-97c6-bde3a29da796","Type":"ContainerStarted","Data":"b6c450bcf92382b603afb108ffce51277656e7c13205fe6efdfb5f56cb4f3fad"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.220852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-125a-account-create-update-bd2q8" event={"ID":"a3d903f3-8f86-49e2-848b-4a59a9068b75","Type":"ContainerStarted","Data":"c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.221814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-svsfg" event={"ID":"9c97128d-8360-482e-b05b-6025d046c122","Type":"ContainerStarted","Data":"5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.222624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8b3-account-create-update-br2xj" event={"ID":"7c7420bd-d4ef-4511-acf4-a132ad0a5677","Type":"ContainerStarted","Data":"b2785a54fb58f00cffa05ba7e64b052a132025e8e4e4c971af47565aa7808a85"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.372166 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:13:11 crc kubenswrapper[4962]: W0220 10:13:11.381776 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podced7b045_00ec_453d_9a56_b13132991e8c.slice/crio-e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885 WatchSource:0}: Error finding container e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885: Status 404 returned error can't find the container with id 
e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885 Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.507886 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.507965 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.726899 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.736918 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.756493 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.757103 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.757402 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fvgjk" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.760288 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.780265 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913651 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 
10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.968935 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.971775 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.974311 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.975070 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.975207 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.981603 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.016227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57kkn\" (UniqueName: 
\"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.018157 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.018236 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.018255 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.018316 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:12.518296295 +0000 UTC m=+1084.100768141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.018338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.023068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.038343 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.062553 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: 
\"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"swift-ring-rebalance-9mznb\" (UID: 
\"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"swift-ring-rebalance-9mznb\" (UID: 
\"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.238347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerStarted","Data":"1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66"} Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.240058 
4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerStarted","Data":"e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885"} Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.254477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.254681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.255252 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.255519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.256055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod 
\"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.257789 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.258770 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.292981 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.531773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.532071 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.532276 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.532349 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift 
podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:13.532325272 +0000 UTC m=+1085.114797128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.856567 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.860878 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.254310 4962 generic.go:334] "Generic (PLEG): container finished" podID="ced7b045-00ec-453d-9a56-b13132991e8c" containerID="7ec00e9b2989f478e117f8d08060d562a11edc343ce310e24fa477348d6aca1b" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.254378 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerDied","Data":"7ec00e9b2989f478e117f8d08060d562a11edc343ce310e24fa477348d6aca1b"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.258306 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c97128d-8360-482e-b05b-6025d046c122" containerID="8e1e57cd49c915d1862d936053074b6280af762ac9dd3bf4c1c80c561fca009f" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.258389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-svsfg" 
event={"ID":"9c97128d-8360-482e-b05b-6025d046c122","Type":"ContainerDied","Data":"8e1e57cd49c915d1862d936053074b6280af762ac9dd3bf4c1c80c561fca009f"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.265483 4962 generic.go:334] "Generic (PLEG): container finished" podID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerID="060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.265596 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerDied","Data":"060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.271649 4962 generic.go:334] "Generic (PLEG): container finished" podID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerID="109a3b4f30138b426060ee3960875f54b8e50460794fa326f4252e9233232cac" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.271740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8b3-account-create-update-br2xj" event={"ID":"7c7420bd-d4ef-4511-acf4-a132ad0a5677","Type":"ContainerDied","Data":"109a3b4f30138b426060ee3960875f54b8e50460794fa326f4252e9233232cac"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.278751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerStarted","Data":"8c010bf1bf294a17a678e18297fc5c3ac174eb1389754dd15038dc4e26f7804b"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.291316 4962 generic.go:334] "Generic (PLEG): container finished" podID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerID="1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.291401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerDied","Data":"1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.297031 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerID="a637cdafdb841809ec5f95151c668e1d8c78d29aabd8a60383f137a82dcb2009" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.297116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zfmzb" event={"ID":"2b915fcc-cf15-43c3-97c6-bde3a29da796","Type":"ContainerDied","Data":"a637cdafdb841809ec5f95151c668e1d8c78d29aabd8a60383f137a82dcb2009"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.304114 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerID="1dbe5f8319feef22f1ef43626823510cfe8e71d6a8d49cafca70087ce33b1b60" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.304211 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-125a-account-create-update-bd2q8" event={"ID":"a3d903f3-8f86-49e2-848b-4a59a9068b75","Type":"ContainerDied","Data":"1dbe5f8319feef22f1ef43626823510cfe8e71d6a8d49cafca70087ce33b1b60"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.403816 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.555479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:13 crc kubenswrapper[4962]: E0220 10:13:13.556301 4962 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:13 crc kubenswrapper[4962]: E0220 10:13:13.556367 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:13 crc kubenswrapper[4962]: E0220 10:13:13.556454 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:15.556417583 +0000 UTC m=+1087.138889439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.659811 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.662639 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.668052 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.765642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.765719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.784918 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.786256 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.788885 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.791405 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.795894 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.868323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.868404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.869453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.902419 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.971897 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.972525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.972667 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.972865 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.973655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.973750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"glance-7588-account-create-update-6ttfz\" (UID: 
\"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.977881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm" (OuterVolumeSpecName: "kube-api-access-cn4cm") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "kube-api-access-cn4cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.008059 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.016793 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config" (OuterVolumeSpecName: "config") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.017283 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: E0220 10:13:14.017436 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb podName:736ba007-2c6d-4f91-ae26-16ce53c580c5 nodeName:}" failed. No retries permitted until 2026-02-20 10:13:14.517405519 +0000 UTC m=+1086.099877365 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5") : error deleting /var/lib/kubelet/pods/736ba007-2c6d-4f91-ae26-16ce53c580c5/volume-subpaths: remove /var/lib/kubelet/pods/736ba007-2c6d-4f91-ae26-16ce53c580c5/volume-subpaths: no such file or directory Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.075832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076327 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076342 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076356 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.078167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.099965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.104918 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.324572 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerStarted","Data":"025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3"} Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.326516 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.339809 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.340437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerDied","Data":"bdc0784e8ac6a8e38cc361b433d0c6167f165dee537a2968d10b45106c2fa62c"} Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.340491 4962 scope.go:117] "RemoveContainer" containerID="060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.374914 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" podStartSLOduration=4.374880611 podStartE2EDuration="4.374880611s" podCreationTimestamp="2026-02-20 10:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:14.34629082 +0000 UTC m=+1085.928762666" watchObservedRunningTime="2026-02-20 10:13:14.374880611 +0000 UTC m=+1085.957352457" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.397476 4962 scope.go:117] "RemoveContainer" containerID="8b1c406d73d48e9a02cd34de0f3b729ea2c65e81565abd3265c085cac257a091" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.513517 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:13:14 crc kubenswrapper[4962]: W0220 10:13:14.527275 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0275d40a_1206_4eb2_96c8_6c516c57bed7.slice/crio-49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765 WatchSource:0}: Error finding container 49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765: Status 404 returned error can't find the container with id 
49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765 Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.589861 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.591038 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.591266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.703087 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.712313 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.722971 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.837503 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.918571 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.918934 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.920940 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb73a133-7ca1-492e-ac32-fb33d6c335ba" (UID: "cb73a133-7ca1-492e-ac32-fb33d6c335ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.933570 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr" (OuterVolumeSpecName: "kube-api-access-6p7fr") pod "cb73a133-7ca1-492e-ac32-fb33d6c335ba" (UID: "cb73a133-7ca1-492e-ac32-fb33d6c335ba"). InnerVolumeSpecName "kube-api-access-6p7fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.021108 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.021144 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.153090 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" path="/var/lib/kubelet/pods/736ba007-2c6d-4f91-ae26-16ce53c580c5/volumes" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.401827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7588-account-create-update-6ttfz" event={"ID":"afbf9dd3-3bb5-4908-aad0-d06f09946e17","Type":"ContainerStarted","Data":"801e7d436fb1225d117310066525072e6dde83be410ac37d92ea2f315f19006b"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.404180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerDied","Data":"f7f2e02b137d5913310205dbb052550410f12f1f612581277520789b3dc50ed0"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.404379 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f2e02b137d5913310205dbb052550410f12f1f612581277520789b3dc50ed0" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.404425 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.406331 4962 generic.go:334] "Generic (PLEG): container finished" podID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerID="1933a4410cc57079acebbf3cca845c0c1a3c75df94daefc5b4a3cc61d913faab" exitCode=0 Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.406513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2m8r7" event={"ID":"0275d40a-1206-4eb2-96c8-6c516c57bed7","Type":"ContainerDied","Data":"1933a4410cc57079acebbf3cca845c0c1a3c75df94daefc5b4a3cc61d913faab"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.407497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2m8r7" event={"ID":"0275d40a-1206-4eb2-96c8-6c516c57bed7","Type":"ContainerStarted","Data":"49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.635033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:15 crc kubenswrapper[4962]: E0220 10:13:15.635436 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:15 crc kubenswrapper[4962]: E0220 10:13:15.635462 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:15 crc kubenswrapper[4962]: E0220 10:13:15.635532 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. 
No retries permitted until 2026-02-20 10:13:19.635512688 +0000 UTC m=+1091.217984534 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.447501 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-125a-account-create-update-bd2q8" event={"ID":"a3d903f3-8f86-49e2-848b-4a59a9068b75","Type":"ContainerDied","Data":"c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.450493 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.453069 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2m8r7" event={"ID":"0275d40a-1206-4eb2-96c8-6c516c57bed7","Type":"ContainerDied","Data":"49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.453119 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.455230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-svsfg" event={"ID":"9c97128d-8360-482e-b05b-6025d046c122","Type":"ContainerDied","Data":"5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.455280 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36" Feb 20 10:13:17 crc kubenswrapper[4962]: 
I0220 10:13:17.458076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8b3-account-create-update-br2xj" event={"ID":"7c7420bd-d4ef-4511-acf4-a132ad0a5677","Type":"ContainerDied","Data":"b2785a54fb58f00cffa05ba7e64b052a132025e8e4e4c971af47565aa7808a85"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.458115 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2785a54fb58f00cffa05ba7e64b052a132025e8e4e4c971af47565aa7808a85" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.464486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zfmzb" event={"ID":"2b915fcc-cf15-43c3-97c6-bde3a29da796","Type":"ContainerDied","Data":"b6c450bcf92382b603afb108ffce51277656e7c13205fe6efdfb5f56cb4f3fad"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.464525 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c450bcf92382b603afb108ffce51277656e7c13205fe6efdfb5f56cb4f3fad" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.507087 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.554709 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.557536 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.593862 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.603200 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"9c97128d-8360-482e-b05b-6025d046c122\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"a3d903f3-8f86-49e2-848b-4a59a9068b75\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605780 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605888 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"9c97128d-8360-482e-b05b-6025d046c122\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605945 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"a3d903f3-8f86-49e2-848b-4a59a9068b75\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.608610 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c7420bd-d4ef-4511-acf4-a132ad0a5677" (UID: "7c7420bd-d4ef-4511-acf4-a132ad0a5677"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.609533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c97128d-8360-482e-b05b-6025d046c122" (UID: "9c97128d-8360-482e-b05b-6025d046c122"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.609847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3d903f3-8f86-49e2-848b-4a59a9068b75" (UID: "a3d903f3-8f86-49e2-848b-4a59a9068b75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.616340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh" (OuterVolumeSpecName: "kube-api-access-c76wh") pod "9c97128d-8360-482e-b05b-6025d046c122" (UID: "9c97128d-8360-482e-b05b-6025d046c122"). 
InnerVolumeSpecName "kube-api-access-c76wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.622091 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26" (OuterVolumeSpecName: "kube-api-access-48t26") pod "7c7420bd-d4ef-4511-acf4-a132ad0a5677" (UID: "7c7420bd-d4ef-4511-acf4-a132ad0a5677"). InnerVolumeSpecName "kube-api-access-48t26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.622145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft" (OuterVolumeSpecName: "kube-api-access-rqhft") pod "a3d903f3-8f86-49e2-848b-4a59a9068b75" (UID: "a3d903f3-8f86-49e2-848b-4a59a9068b75"). InnerVolumeSpecName "kube-api-access-rqhft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.707859 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"0275d40a-1206-4eb2-96c8-6c516c57bed7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708051 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"0275d40a-1206-4eb2-96c8-6c516c57bed7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szvdf\" (UniqueName: 
\"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"2b915fcc-cf15-43c3-97c6-bde3a29da796\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"2b915fcc-cf15-43c3-97c6-bde3a29da796\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708862 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708881 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708895 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708908 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708921 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708931 4962 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.709413 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b915fcc-cf15-43c3-97c6-bde3a29da796" (UID: "2b915fcc-cf15-43c3-97c6-bde3a29da796"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.709567 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0275d40a-1206-4eb2-96c8-6c516c57bed7" (UID: "0275d40a-1206-4eb2-96c8-6c516c57bed7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.711808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf" (OuterVolumeSpecName: "kube-api-access-szvdf") pod "2b915fcc-cf15-43c3-97c6-bde3a29da796" (UID: "2b915fcc-cf15-43c3-97c6-bde3a29da796"). InnerVolumeSpecName "kube-api-access-szvdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.714745 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559" (OuterVolumeSpecName: "kube-api-access-t9559") pod "0275d40a-1206-4eb2-96c8-6c516c57bed7" (UID: "0275d40a-1206-4eb2-96c8-6c516c57bed7"). InnerVolumeSpecName "kube-api-access-t9559". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810380 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810423 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810442 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810455 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.476051 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerStarted","Data":"1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8"} Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.478888 4962 generic.go:334] "Generic (PLEG): container finished" podID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerID="0be6bfc0db94e6c57e1c0a4856d3600b1ea4d12d42a32685b52156cacc1224a0" exitCode=0 Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479046 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479061 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7588-account-create-update-6ttfz" event={"ID":"afbf9dd3-3bb5-4908-aad0-d06f09946e17","Type":"ContainerDied","Data":"0be6bfc0db94e6c57e1c0a4856d3600b1ea4d12d42a32685b52156cacc1224a0"} Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479316 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479446 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479561 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.505040 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9mznb" podStartSLOduration=3.172032485 podStartE2EDuration="7.505018547s" podCreationTimestamp="2026-02-20 10:13:11 +0000 UTC" firstStartedPulling="2026-02-20 10:13:12.903593124 +0000 UTC m=+1084.486064970" lastFinishedPulling="2026-02-20 10:13:17.236579156 +0000 UTC m=+1088.819051032" observedRunningTime="2026-02-20 10:13:18.499412302 +0000 UTC m=+1090.081884158" watchObservedRunningTime="2026-02-20 10:13:18.505018547 +0000 UTC m=+1090.087490393" Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.659821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:19 crc kubenswrapper[4962]: E0220 10:13:19.660160 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:19 crc kubenswrapper[4962]: E0220 10:13:19.660792 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:19 crc kubenswrapper[4962]: E0220 10:13:19.660890 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:27.66085765 +0000 UTC m=+1099.243329496 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.878024 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.966586 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.966655 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.967983 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afbf9dd3-3bb5-4908-aad0-d06f09946e17" (UID: "afbf9dd3-3bb5-4908-aad0-d06f09946e17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.976302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf" (OuterVolumeSpecName: "kube-api-access-7c2xf") pod "afbf9dd3-3bb5-4908-aad0-d06f09946e17" (UID: "afbf9dd3-3bb5-4908-aad0-d06f09946e17"). 
InnerVolumeSpecName "kube-api-access-7c2xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.069264 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.069310 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.414731 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.423691 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.500328 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7588-account-create-update-6ttfz" event={"ID":"afbf9dd3-3bb5-4908-aad0-d06f09946e17","Type":"ContainerDied","Data":"801e7d436fb1225d117310066525072e6dde83be410ac37d92ea2f315f19006b"} Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.500386 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="801e7d436fb1225d117310066525072e6dde83be410ac37d92ea2f315f19006b" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.500396 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.903035 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.002949 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.003266 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" containerID="cri-o://5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" gracePeriod=10 Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.154684 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" path="/var/lib/kubelet/pods/cb73a133-7ca1-492e-ac32-fb33d6c335ba/volumes" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.505219 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511209 4962 generic.go:334] "Generic (PLEG): container finished" podID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" exitCode=0 Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerDied","Data":"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80"} Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerDied","Data":"5a4113e8006a84520a74694b80780b48a9159ec2ba04b9aa6174205d45e900e7"} Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511310 4962 scope.go:117] "RemoveContainer" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511471 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.546780 4962 scope.go:117] "RemoveContainer" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.566413 4962 scope.go:117] "RemoveContainer" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" Feb 20 10:13:21 crc kubenswrapper[4962]: E0220 10:13:21.567024 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80\": container with ID starting with 5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80 not found: ID does not exist" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.567083 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80"} err="failed to get container status \"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80\": rpc error: code = NotFound desc = could not find container \"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80\": container with ID starting with 5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80 not found: ID does not exist" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.567115 4962 scope.go:117] "RemoveContainer" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" Feb 20 10:13:21 crc kubenswrapper[4962]: E0220 10:13:21.567698 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb\": container with ID starting with 
d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb not found: ID does not exist" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.567751 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb"} err="failed to get container status \"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb\": rpc error: code = NotFound desc = could not find container \"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb\": container with ID starting with d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb not found: ID does not exist" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600850 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600896 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600953 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.607172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l" (OuterVolumeSpecName: "kube-api-access-fxq9l") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "kube-api-access-fxq9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.643125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.653845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config" (OuterVolumeSpecName: "config") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.657075 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.661526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703568 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703638 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703651 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703661 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 
20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703671 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.865021 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.877178 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.157101 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" path="/var/lib/kubelet/pods/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5/volumes" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.952694 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953405 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953445 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953472 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953486 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953495 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953526 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953537 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953552 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c97128d-8360-482e-b05b-6025d046c122" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953560 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c97128d-8360-482e-b05b-6025d046c122" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953577 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953586 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953623 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 
10:13:23.953641 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953650 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953674 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953683 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953705 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953714 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953738 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953747 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953970 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953989 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954004 
4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954015 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954046 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954067 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954079 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954091 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c97128d-8360-482e-b05b-6025d046c122" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954104 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954891 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.961212 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.961231 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.963504 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhn7" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.970841 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052306 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052944 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.162456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.162829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.169135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.179245 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.275033 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.549354 4962 generic.go:334] "Generic (PLEG): container finished" podID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerID="1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8" exitCode=0 Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.549529 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerDied","Data":"1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8"} Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.679839 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.399028 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.400397 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.404672 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.418146 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.498814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.499217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.563198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerStarted","Data":"a8e14a05cffa52a7e3ad38a2be0cb8a03501b42d139fbf401ec8ecee6a2bd2a6"} Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.601472 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc 
kubenswrapper[4962]: I0220 10:13:25.601681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.602432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.628102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.771959 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.904885 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007272 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007402 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007429 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007498 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007542 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007630 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.009664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.009674 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.027482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.036475 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5" (OuterVolumeSpecName: "kube-api-access-dnmc5") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "kube-api-access-dnmc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.036568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.048786 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.048920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts" (OuterVolumeSpecName: "scripts") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110115 4962 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110155 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110167 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110180 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110192 4962 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110200 4962 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110210 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.245671 4962 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:13:26 crc kubenswrapper[4962]: W0220 10:13:26.254431 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598e051e_58af_4a1a_aa46_7f88d635f34c.slice/crio-3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610 WatchSource:0}: Error finding container 3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610: Status 404 returned error can't find the container with id 3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610 Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.593442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerDied","Data":"8c010bf1bf294a17a678e18297fc5c3ac174eb1389754dd15038dc4e26f7804b"} Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.593497 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c010bf1bf294a17a678e18297fc5c3ac174eb1389754dd15038dc4e26f7804b" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.593511 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.598391 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerStarted","Data":"1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7"} Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.598860 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerStarted","Data":"3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610"} Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.609729 4962 generic.go:334] "Generic (PLEG): container finished" podID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerID="1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7" exitCode=0 Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.609788 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerDied","Data":"1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7"} Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.745004 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.752231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:28 crc 
kubenswrapper[4962]: I0220 10:13:28.004037 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:13:28 crc kubenswrapper[4962]: I0220 10:13:28.579727 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:13:28 crc kubenswrapper[4962]: W0220 10:13:28.586982 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4fb3b99_0e02_4c5c_9704_884ea3f0605d.slice/crio-7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5 WatchSource:0}: Error finding container 7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5: Status 404 returned error can't find the container with id 7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5 Feb 20 10:13:28 crc kubenswrapper[4962]: I0220 10:13:28.619215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5"} Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.026898 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.195112 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"598e051e-58af-4a1a-aa46-7f88d635f34c\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.195255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"598e051e-58af-4a1a-aa46-7f88d635f34c\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.197526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "598e051e-58af-4a1a-aa46-7f88d635f34c" (UID: "598e051e-58af-4a1a-aa46-7f88d635f34c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.213787 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq" (OuterVolumeSpecName: "kube-api-access-6vsmq") pod "598e051e-58af-4a1a-aa46-7f88d635f34c" (UID: "598e051e-58af-4a1a-aa46-7f88d635f34c"). InnerVolumeSpecName "kube-api-access-6vsmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.233289 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wj9f6" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" probeResult="failure" output=< Feb 20 10:13:29 crc kubenswrapper[4962]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 10:13:29 crc kubenswrapper[4962]: > Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.297674 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.297709 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.479850 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.482240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.627674 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.627678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerDied","Data":"3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610"} Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.627737 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.722644 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:29 crc kubenswrapper[4962]: E0220 10:13:29.723025 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerName="swift-ring-rebalance" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723047 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerName="swift-ring-rebalance" Feb 20 10:13:29 crc kubenswrapper[4962]: E0220 10:13:29.723076 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerName="mariadb-account-create-update" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723087 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerName="mariadb-account-create-update" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723308 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerName="mariadb-account-create-update" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723332 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" 
containerName="swift-ring-rebalance" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.724726 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.727751 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.733666 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814070 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814371 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917127 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917487 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.918389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.919667 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.935748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.053454 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.575964 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.639659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36"} Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.639721 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1"} Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.640924 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6-config-grwch" event={"ID":"b4733c39-1a37-4a56-a731-88fcac6da1c0","Type":"ContainerStarted","Data":"2369792fd5085de62a1c8ae711d4a2040dcad9e3ce1de30b07680e2c16a856e9"} Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.652510 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerID="a4453e5e140badbab6aa97996c8ab339f8ab22881b41594395bdb84a3005b466" exitCode=0 Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.652641 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6-config-grwch" event={"ID":"b4733c39-1a37-4a56-a731-88fcac6da1c0","Type":"ContainerDied","Data":"a4453e5e140badbab6aa97996c8ab339f8ab22881b41594395bdb84a3005b466"} Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.661674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c"} Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.661726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601"} Feb 20 10:13:33 crc kubenswrapper[4962]: I0220 10:13:33.686047 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" exitCode=0 Feb 20 10:13:33 crc kubenswrapper[4962]: I0220 10:13:33.686135 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerDied","Data":"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e"} Feb 20 10:13:34 crc kubenswrapper[4962]: I0220 10:13:34.211815 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wj9f6" Feb 20 10:13:34 crc kubenswrapper[4962]: I0220 10:13:34.705511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerDied","Data":"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857"} Feb 20 10:13:34 crc kubenswrapper[4962]: I0220 10:13:34.705252 4962 generic.go:334] "Generic (PLEG): container finished" podID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" exitCode=0 Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.260540 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.329928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.329991 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330058 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330358 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzdf9\" (UniqueName: 
\"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.331425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.331533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run" (OuterVolumeSpecName: "var-run") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.331653 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.332192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts" (OuterVolumeSpecName: "scripts") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.332388 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.336695 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.337858 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.337966 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.338031 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.338101 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.348782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9" (OuterVolumeSpecName: "kube-api-access-hzdf9") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "kube-api-access-hzdf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.441208 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.793635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerStarted","Data":"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308"} Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.794849 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.810155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6-config-grwch" event={"ID":"b4733c39-1a37-4a56-a731-88fcac6da1c0","Type":"ContainerDied","Data":"2369792fd5085de62a1c8ae711d4a2040dcad9e3ce1de30b07680e2c16a856e9"} Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.810202 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2369792fd5085de62a1c8ae711d4a2040dcad9e3ce1de30b07680e2c16a856e9" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.810290 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.824345 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerStarted","Data":"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718"} Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.824673 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.842506 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.344408399 podStartE2EDuration="1m16.842486749s" podCreationTimestamp="2026-02-20 10:12:23 +0000 UTC" firstStartedPulling="2026-02-20 10:12:25.092635469 +0000 UTC m=+1036.675107315" lastFinishedPulling="2026-02-20 10:12:59.590713809 +0000 UTC m=+1071.173185665" observedRunningTime="2026-02-20 10:13:39.838736302 +0000 UTC m=+1111.421208148" watchObservedRunningTime="2026-02-20 10:13:39.842486749 +0000 UTC m=+1111.424958595" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.905582 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.281711095 podStartE2EDuration="1m16.905556891s" podCreationTimestamp="2026-02-20 10:12:23 +0000 UTC" firstStartedPulling="2026-02-20 10:12:25.969312919 +0000 UTC m=+1037.551784765" lastFinishedPulling="2026-02-20 10:12:59.593158705 +0000 UTC m=+1071.175630561" observedRunningTime="2026-02-20 10:13:39.89974156 +0000 UTC m=+1111.482213406" watchObservedRunningTime="2026-02-20 10:13:39.905556891 +0000 UTC m=+1111.488028737" Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.402134 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:40 crc 
kubenswrapper[4962]: I0220 10:13:40.411560 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.840666 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.841221 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.841240 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.842861 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerStarted","Data":"ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.866423 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9gcrq" podStartSLOduration=3.385041449 podStartE2EDuration="17.866395147s" podCreationTimestamp="2026-02-20 10:13:23 +0000 UTC" firstStartedPulling="2026-02-20 10:13:24.692980575 +0000 UTC m=+1096.275452411" lastFinishedPulling="2026-02-20 10:13:39.174334253 +0000 UTC m=+1110.756806109" observedRunningTime="2026-02-20 10:13:40.863978762 +0000 UTC m=+1112.446450608" watchObservedRunningTime="2026-02-20 10:13:40.866395147 +0000 UTC m=+1112.448866993" Feb 20 10:13:41 
crc kubenswrapper[4962]: I0220 10:13:41.153829 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" path="/var/lib/kubelet/pods/b4733c39-1a37-4a56-a731-88fcac6da1c0/volumes" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.507903 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.507991 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.508050 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.509025 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.509089 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1" 
gracePeriod=600 Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.855835 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1" exitCode=0 Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.856322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1"} Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.856379 4962 scope.go:117] "RemoveContainer" containerID="00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.863470 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0"} Feb 20 10:13:42 crc kubenswrapper[4962]: I0220 10:13:42.874704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716"} Feb 20 10:13:43 crc kubenswrapper[4962]: I0220 10:13:43.891807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff"} Feb 20 10:13:43 crc kubenswrapper[4962]: I0220 10:13:43.892388 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.907809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.956527 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.574923872 podStartE2EDuration="34.956498348s" podCreationTimestamp="2026-02-20 10:13:10 +0000 UTC" firstStartedPulling="2026-02-20 
10:13:28.589434424 +0000 UTC m=+1100.171906260" lastFinishedPulling="2026-02-20 10:13:42.97100887 +0000 UTC m=+1114.553480736" observedRunningTime="2026-02-20 10:13:44.95009829 +0000 UTC m=+1116.532570136" watchObservedRunningTime="2026-02-20 10:13:44.956498348 +0000 UTC m=+1116.538970204" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.263581 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:45 crc kubenswrapper[4962]: E0220 10:13:45.264131 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerName="ovn-config" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.264155 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerName="ovn-config" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.264363 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerName="ovn-config" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.265554 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.267983 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.287393 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383900 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " 
pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.384182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.384462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.485933 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486004 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: 
\"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486124 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486154 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 
20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487975 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.488007 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.513686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.586149 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.116161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.928718 4962 generic.go:334] "Generic (PLEG): container finished" podID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerID="584310e205b8e31a078417aa3e179930fe706052662212b3eaf969a39ad7c786" exitCode=0 Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.928852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerDied","Data":"584310e205b8e31a078417aa3e179930fe706052662212b3eaf969a39ad7c786"} Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.929516 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerStarted","Data":"6231efbcfe405ddd4c89430a5b15578c84264d02a7e2d321eede6e43d22dfa60"} Feb 20 10:13:47 crc kubenswrapper[4962]: I0220 10:13:47.942896 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerStarted","Data":"a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4"} Feb 20 10:13:47 crc kubenswrapper[4962]: I0220 10:13:47.943407 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:47 crc kubenswrapper[4962]: I0220 10:13:47.973252 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" podStartSLOduration=2.973229483 podStartE2EDuration="2.973229483s" podCreationTimestamp="2026-02-20 10:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:47.967488255 +0000 UTC m=+1119.549960101" watchObservedRunningTime="2026-02-20 10:13:47.973229483 +0000 UTC m=+1119.555701319" Feb 20 10:13:48 crc kubenswrapper[4962]: I0220 10:13:48.953952 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerID="ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e" exitCode=0 Feb 20 10:13:48 crc kubenswrapper[4962]: I0220 10:13:48.954062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerDied","Data":"ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e"} Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.781310 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816774 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") 
" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816804 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.825731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.826046 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl" (OuterVolumeSpecName: "kube-api-access-dqljl") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "kube-api-access-dqljl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.847554 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.869470 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data" (OuterVolumeSpecName: "config-data") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919923 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919959 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919974 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919984 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.981815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerDied","Data":"a8e14a05cffa52a7e3ad38a2be0cb8a03501b42d139fbf401ec8ecee6a2bd2a6"} Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.981880 4962 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="a8e14a05cffa52a7e3ad38a2be0cb8a03501b42d139fbf401ec8ecee6a2bd2a6" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.981903 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.447200 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.454345 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" containerID="cri-o://a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4" gracePeriod=10 Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.503960 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:13:51 crc kubenswrapper[4962]: E0220 10:13:51.504378 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerName="glance-db-sync" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.504392 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerName="glance-db-sync" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.504582 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerName="glance-db-sync" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.505724 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.529650 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534272 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534311 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534408 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.637846 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.637915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.637946 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.638041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.638466 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.660858 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.828828 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:52 crc kubenswrapper[4962]: I0220 10:13:52.004768 4962 generic.go:334] "Generic (PLEG): container finished" podID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerID="a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4" exitCode=0 Feb 20 10:13:52 crc kubenswrapper[4962]: I0220 10:13:52.004955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerDied","Data":"a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4"} Feb 20 10:13:52 crc kubenswrapper[4962]: I0220 10:13:52.814262 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:13:52 crc kubenswrapper[4962]: W0220 10:13:52.825454 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cdf678_dd6c_4f3b_a675_4803eddcfc44.slice/crio-83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879 WatchSource:0}: Error finding container 83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879: Status 404 returned error can't find the container with id 83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879 Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.015449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerStarted","Data":"83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879"} Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.505850 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608513 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608754 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608886 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.618402 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m" (OuterVolumeSpecName: "kube-api-access-ht65m") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "kube-api-access-ht65m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.667764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.670700 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config" (OuterVolumeSpecName: "config") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.674808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.678415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.703074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710023 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710047 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710060 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710071 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710081 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710091 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.029723 4962 generic.go:334] "Generic (PLEG): container finished" podID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerID="c3f233006bdf1d16d8946733067213908be75ed885abe76b0ea0e53fac4b17ed" exitCode=0 Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.029858 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerDied","Data":"c3f233006bdf1d16d8946733067213908be75ed885abe76b0ea0e53fac4b17ed"} Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.032680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerDied","Data":"6231efbcfe405ddd4c89430a5b15578c84264d02a7e2d321eede6e43d22dfa60"} Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.032763 4962 scope.go:117] "RemoveContainer" containerID="a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.032768 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.221297 4962 scope.go:117] "RemoveContainer" containerID="584310e205b8e31a078417aa3e179930fe706052662212b3eaf969a39ad7c786" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.257033 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.264406 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.578040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.921894 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:13:54 crc kubenswrapper[4962]: E0220 10:13:54.922274 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.922293 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" Feb 20 10:13:54 crc kubenswrapper[4962]: E0220 10:13:54.922328 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="init" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.922336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="init" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.922499 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.923237 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.948972 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.038391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.038608 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.039537 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.040700 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.042821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerStarted","Data":"e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab"} Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.043325 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.051947 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.059212 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.096031 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podStartSLOduration=4.096005524 podStartE2EDuration="4.096005524s" podCreationTimestamp="2026-02-20 10:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:55.094353442 +0000 UTC m=+1126.676825288" watchObservedRunningTime="2026-02-20 10:13:55.096005524 +0000 UTC m=+1126.678477370" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.140838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.141015 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.141059 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.141088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.142815 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.154089 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" path="/var/lib/kubelet/pods/89a12a35-60fb-43fc-bd27-d7db10bc1aaa/volumes" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.160947 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7987d\" (UniqueName: 
\"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.242759 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.242849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.243943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.248921 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.272345 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.284068 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.287587 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.290635 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.291400 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.292265 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.292444 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.301402 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.329826 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.344796 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.344839 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.344967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.358165 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.359332 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.360955 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.411763 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.473443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.473844 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.473871 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.489447 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.491448 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.493848 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.503869 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.513155 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.526361 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.528868 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.542051 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.549948 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.550175 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577378 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577497 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577554 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577582 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp8bw\" (UniqueName: 
\"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577703 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.644234 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.645452 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.651844 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.670376 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.674970 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683812 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683905 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"neutron-695f-account-create-update-22t44\" 
(UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.684013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.684724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.685216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.685692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.716912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.724813 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.733116 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.765052 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.785912 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.785986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.787035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.842136 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.843503 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.884291 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.971168 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:56.130856 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:13:56 crc kubenswrapper[4962]: W0220 10:13:56.179696 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode761565e_55de_43bc_b82d_95b776652b5c.slice/crio-dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef WatchSource:0}: Error finding container dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef: Status 404 returned error can't find the container with id dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:56.936723 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:56.954945 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.127531 4962 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:13:57 crc kubenswrapper[4962]: W0220 10:13:57.145809 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21296df9_6e67_4427_959d_8d67bfd1393b.slice/crio-afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3 WatchSource:0}: Error finding container afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3: Status 404 returned error can't find the container with id afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3 Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.158370 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.173051 4962 generic.go:334] "Generic (PLEG): container finished" podID="e761565e-55de-43bc-b82d-95b776652b5c" containerID="53831e942d8d69707dcfe40655e43c5762a4d492f07b1c79ed7f413953ec5f61" exitCode=0 Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.173119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5ptn" event={"ID":"e761565e-55de-43bc-b82d-95b776652b5c","Type":"ContainerDied","Data":"53831e942d8d69707dcfe40655e43c5762a4d492f07b1c79ed7f413953ec5f61"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.173195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5ptn" event={"ID":"e761565e-55de-43bc-b82d-95b776652b5c","Type":"ContainerStarted","Data":"dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.180150 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.201205 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" 
event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerStarted","Data":"d9fb2f47a09b42f7a4e90c5cb0cc07c8e1190c6442bbdd0ca0c2a2429a245afe"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.218371 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-44g6w" event={"ID":"7e2005e0-31d4-408f-8c66-187a6dd37bcd","Type":"ContainerStarted","Data":"d1ace52df18655ab33bfdec0b202a45aaa09716c2023e587cd508d5e0ef9db45"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.260639 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.227287 4962 generic.go:334] "Generic (PLEG): container finished" podID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerID="03ab33469ea979640d7188e1c0dc68dd1548a99d601929f7b4e160bee72396f3" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.227623 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-44g6w" event={"ID":"7e2005e0-31d4-408f-8c66-187a6dd37bcd","Type":"ContainerDied","Data":"03ab33469ea979640d7188e1c0dc68dd1548a99d601929f7b4e160bee72396f3"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.229417 4962 generic.go:334] "Generic (PLEG): container finished" podID="21296df9-6e67-4427-959d-8d67bfd1393b" containerID="2c027b22cf0ba460d458ecf5143a855bd6cabc995b34bcff27678d1a95ac71b9" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.229456 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-11b3-account-create-update-x5n92" event={"ID":"21296df9-6e67-4427-959d-8d67bfd1393b","Type":"ContainerDied","Data":"2c027b22cf0ba460d458ecf5143a855bd6cabc995b34bcff27678d1a95ac71b9"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.229472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-11b3-account-create-update-x5n92" 
event={"ID":"21296df9-6e67-4427-959d-8d67bfd1393b","Type":"ContainerStarted","Data":"afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.231356 4962 generic.go:334] "Generic (PLEG): container finished" podID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerID="7fd77ce11ed465ec4237e46a1c362e414960d4a8e3a2e89e44d3a98f1d109ea9" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.231392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4hwp2" event={"ID":"684fc9d7-94f0-418a-b059-e5519e6cd316","Type":"ContainerDied","Data":"7fd77ce11ed465ec4237e46a1c362e414960d4a8e3a2e89e44d3a98f1d109ea9"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.231408 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4hwp2" event={"ID":"684fc9d7-94f0-418a-b059-e5519e6cd316","Type":"ContainerStarted","Data":"f577b4be85d2e6d0328c909c0a4b5923c1ddc7a57c38be3bb1161b8b028e1173"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.232929 4962 generic.go:334] "Generic (PLEG): container finished" podID="4feedd65-778f-471c-a2bf-23af2e459685" containerID="5a9782006ca96cee05b8576db8cf67f09117b6ff20027f1e9a751d12df45c5f2" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.232980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f-account-create-update-22t44" event={"ID":"4feedd65-778f-471c-a2bf-23af2e459685","Type":"ContainerDied","Data":"5a9782006ca96cee05b8576db8cf67f09117b6ff20027f1e9a751d12df45c5f2"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.232996 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f-account-create-update-22t44" event={"ID":"4feedd65-778f-471c-a2bf-23af2e459685","Type":"ContainerStarted","Data":"b746d0ff7cb0abb3cce635c6667ca98effdafd59e40d357fc879d7e372ceb588"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.236811 4962 
generic.go:334] "Generic (PLEG): container finished" podID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerID="aa045e6922dfe4d5b86be77916d3a6f56d92ad5d8849a14be83a3fc1d37883cc" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.236916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-758kd" event={"ID":"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e","Type":"ContainerDied","Data":"aa045e6922dfe4d5b86be77916d3a6f56d92ad5d8849a14be83a3fc1d37883cc"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.236995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-758kd" event={"ID":"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e","Type":"ContainerStarted","Data":"aac5b33561325fc4fe98a641b749b757545fa78da23bf1dbffc4adf1c4229064"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.660702 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.781397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"e761565e-55de-43bc-b82d-95b776652b5c\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.781824 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"e761565e-55de-43bc-b82d-95b776652b5c\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.782652 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"e761565e-55de-43bc-b82d-95b776652b5c" (UID: "e761565e-55de-43bc-b82d-95b776652b5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.795953 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d" (OuterVolumeSpecName: "kube-api-access-7987d") pod "e761565e-55de-43bc-b82d-95b776652b5c" (UID: "e761565e-55de-43bc-b82d-95b776652b5c"). InnerVolumeSpecName "kube-api-access-7987d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.885246 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.885351 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:59 crc kubenswrapper[4962]: I0220 10:13:59.249499 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:59 crc kubenswrapper[4962]: I0220 10:13:59.249488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5ptn" event={"ID":"e761565e-55de-43bc-b82d-95b776652b5c","Type":"ContainerDied","Data":"dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef"} Feb 20 10:13:59 crc kubenswrapper[4962]: I0220 10:13:59.249578 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef" Feb 20 10:14:01 crc kubenswrapper[4962]: I0220 10:14:01.830770 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:01 crc kubenswrapper[4962]: I0220 10:14:01.915192 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:14:01 crc kubenswrapper[4962]: I0220 10:14:01.915622 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" containerID="cri-o://025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3" gracePeriod=10 Feb 20 10:14:02 crc kubenswrapper[4962]: I0220 10:14:02.289749 4962 generic.go:334] "Generic (PLEG): container finished" podID="ced7b045-00ec-453d-9a56-b13132991e8c" containerID="025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3" exitCode=0 Feb 20 10:14:02 crc kubenswrapper[4962]: I0220 10:14:02.289800 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerDied","Data":"025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.184023 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.290976 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"21296df9-6e67-4427-959d-8d67bfd1393b\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.291075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"21296df9-6e67-4427-959d-8d67bfd1393b\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.292236 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21296df9-6e67-4427-959d-8d67bfd1393b" (UID: "21296df9-6e67-4427-959d-8d67bfd1393b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.296055 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45" (OuterVolumeSpecName: "kube-api-access-nbr45") pod "21296df9-6e67-4427-959d-8d67bfd1393b" (UID: "21296df9-6e67-4427-959d-8d67bfd1393b"). InnerVolumeSpecName "kube-api-access-nbr45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.300869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-44g6w" event={"ID":"7e2005e0-31d4-408f-8c66-187a6dd37bcd","Type":"ContainerDied","Data":"d1ace52df18655ab33bfdec0b202a45aaa09716c2023e587cd508d5e0ef9db45"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.300915 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ace52df18655ab33bfdec0b202a45aaa09716c2023e587cd508d5e0ef9db45" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.303078 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.303084 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-11b3-account-create-update-x5n92" event={"ID":"21296df9-6e67-4427-959d-8d67bfd1393b","Type":"ContainerDied","Data":"afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.303140 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.306488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4hwp2" event={"ID":"684fc9d7-94f0-418a-b059-e5519e6cd316","Type":"ContainerDied","Data":"f577b4be85d2e6d0328c909c0a4b5923c1ddc7a57c38be3bb1161b8b028e1173"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.306532 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f577b4be85d2e6d0328c909c0a4b5923c1ddc7a57c38be3bb1161b8b028e1173" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.311032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-695f-account-create-update-22t44" event={"ID":"4feedd65-778f-471c-a2bf-23af2e459685","Type":"ContainerDied","Data":"b746d0ff7cb0abb3cce635c6667ca98effdafd59e40d357fc879d7e372ceb588"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.311067 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b746d0ff7cb0abb3cce635c6667ca98effdafd59e40d357fc879d7e372ceb588" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.313684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-758kd" event={"ID":"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e","Type":"ContainerDied","Data":"aac5b33561325fc4fe98a641b749b757545fa78da23bf1dbffc4adf1c4229064"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.313738 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac5b33561325fc4fe98a641b749b757545fa78da23bf1dbffc4adf1c4229064" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.316476 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.356105 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.385604 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.398384 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.398425 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.406342 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.482096 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499272 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"4feedd65-778f-471c-a2bf-23af2e459685\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499346 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxrtr\" (UniqueName: 
\"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"4feedd65-778f-471c-a2bf-23af2e459685\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499585 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.501359 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e2005e0-31d4-408f-8c66-187a6dd37bcd" (UID: "7e2005e0-31d4-408f-8c66-187a6dd37bcd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.503734 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4feedd65-778f-471c-a2bf-23af2e459685" (UID: "4feedd65-778f-471c-a2bf-23af2e459685"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.504061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" (UID: "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.509331 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr" (OuterVolumeSpecName: "kube-api-access-hxrtr") pod "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" (UID: "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e"). InnerVolumeSpecName "kube-api-access-hxrtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.510043 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw" (OuterVolumeSpecName: "kube-api-access-bp8bw") pod "7e2005e0-31d4-408f-8c66-187a6dd37bcd" (UID: "7e2005e0-31d4-408f-8c66-187a6dd37bcd"). InnerVolumeSpecName "kube-api-access-bp8bw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.521799 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg" (OuterVolumeSpecName: "kube-api-access-rvkbg") pod "4feedd65-778f-471c-a2bf-23af2e459685" (UID: "4feedd65-778f-471c-a2bf-23af2e459685"). InnerVolumeSpecName "kube-api-access-rvkbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"684fc9d7-94f0-418a-b059-e5519e6cd316\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602509 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"684fc9d7-94f0-418a-b059-e5519e6cd316\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602650 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602691 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603360 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603389 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603405 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603419 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603451 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603465 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.604280 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "684fc9d7-94f0-418a-b059-e5519e6cd316" (UID: "684fc9d7-94f0-418a-b059-e5519e6cd316"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.607805 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk" (OuterVolumeSpecName: "kube-api-access-lmmgk") pod "684fc9d7-94f0-418a-b059-e5519e6cd316" (UID: "684fc9d7-94f0-418a-b059-e5519e6cd316"). InnerVolumeSpecName "kube-api-access-lmmgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.611401 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg" (OuterVolumeSpecName: "kube-api-access-p7hmg") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "kube-api-access-p7hmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.645345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.649337 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.655804 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config" (OuterVolumeSpecName: "config") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.668840 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705480 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705524 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705538 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705551 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705564 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705611 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705624 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.326903 
4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerStarted","Data":"d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde"} Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.341308 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.344874 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.345945 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.346612 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerDied","Data":"e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885"} Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.346710 4962 scope.go:117] "RemoveContainer" containerID="025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.346988 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.347211 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.365788 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-m26vd" podStartSLOduration=3.169373699 podStartE2EDuration="9.365754413s" podCreationTimestamp="2026-02-20 10:13:55 +0000 UTC" firstStartedPulling="2026-02-20 10:13:56.976833127 +0000 UTC m=+1128.559304973" lastFinishedPulling="2026-02-20 10:14:03.173213831 +0000 UTC m=+1134.755685687" observedRunningTime="2026-02-20 10:14:04.359316863 +0000 UTC m=+1135.941788749" watchObservedRunningTime="2026-02-20 10:14:04.365754413 +0000 UTC m=+1135.948226299" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.409909 4962 scope.go:117] "RemoveContainer" containerID="7ec00e9b2989f478e117f8d08060d562a11edc343ce310e24fa477348d6aca1b" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.463662 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.469737 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:14:05 crc kubenswrapper[4962]: I0220 10:14:05.161442 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" path="/var/lib/kubelet/pods/ced7b045-00ec-453d-9a56-b13132991e8c/volumes" Feb 20 10:14:06 crc kubenswrapper[4962]: E0220 10:14:06.746794 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2e7f05_f1f0_4619_ae07_0a7b93ad6408.slice/crio-d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2e7f05_f1f0_4619_ae07_0a7b93ad6408.slice/crio-conmon-d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:14:07 crc kubenswrapper[4962]: I0220 10:14:07.378964 4962 generic.go:334] "Generic (PLEG): container finished" podID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerID="d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde" exitCode=0 Feb 20 10:14:07 crc kubenswrapper[4962]: I0220 10:14:07.379035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerDied","Data":"d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde"} Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.835030 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.916995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.917191 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.917238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod 
\"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.936501 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr" (OuterVolumeSpecName: "kube-api-access-qmgqr") pod "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" (UID: "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408"). InnerVolumeSpecName "kube-api-access-qmgqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.946707 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" (UID: "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.985428 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data" (OuterVolumeSpecName: "config-data") pod "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" (UID: "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.019810 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.019849 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.019861 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.418248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerDied","Data":"d9fb2f47a09b42f7a4e90c5cb0cc07c8e1190c6442bbdd0ca0c2a2429a245afe"} Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.418987 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9fb2f47a09b42f7a4e90c5cb0cc07c8e1190c6442bbdd0ca0c2a2429a245afe" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.419162 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.743568 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744250 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744313 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744373 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744443 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e761565e-55de-43bc-b82d-95b776652b5c" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744540 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e761565e-55de-43bc-b82d-95b776652b5c" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744607 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerName="keystone-db-sync" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744655 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerName="keystone-db-sync" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744708 4962 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4feedd65-778f-471c-a2bf-23af2e459685" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744752 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4feedd65-778f-471c-a2bf-23af2e459685" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744803 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="init" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="init" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744895 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744938 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.745007 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745055 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.745125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745177 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745393 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4feedd65-778f-471c-a2bf-23af2e459685" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745451 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745507 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745560 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745631 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745688 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e761565e-55de-43bc-b82d-95b776652b5c" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745792 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerName="keystone-db-sync" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.746970 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.791047 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.838159 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.841560 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.844837 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846184 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc 
kubenswrapper[4962]: I0220 10:14:09.846229 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846394 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847738 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847784 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847971 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.882472 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"]
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949004 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949142 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949238 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949276 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949394 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949474 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949806 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.951673 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.954478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.958219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.958622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.958760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.032583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060522 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060653 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.061050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.061126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.085768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.086512 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.089102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.094861 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.106450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.121996 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.144022 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.214954 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-s4qgr"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.216078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.216348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.226640 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-87fmw"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.226955 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.227162 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.241656 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-v7sjh"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.243163 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.253014 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.253447 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bnxb6"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.260099 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.266692 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s4qgr"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.302473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v7sjh"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370321 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370557 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370668 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370761 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370832 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.378737 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.442069 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mk67n"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.443455 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.455314 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mk67n"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.462950 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gtm5t"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.463095 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-smcqr"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.464422 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.470511 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.485906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.486119 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.486390 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bt79l"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.486578 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.487729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488552 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488835 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488918 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.496732 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.499169 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.498180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.498224 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-smcqr"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.495563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.514882 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.517500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.524974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.525423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.537256 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.538359 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.545736 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.546493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.570043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.575271 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.575557 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.609753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.609887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.609946 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610422 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.613390 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.635668 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.646632 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s4qgr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.646955 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.672059 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"]
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718798 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718885 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.719026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.719728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v7sjh"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.732893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.733936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.734115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n"
Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.740749 4962
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.743484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.743750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.752708 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"ceilometer-0\" (UID: 
\"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824450 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824483 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824537 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824554 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc 
kubenswrapper[4962]: I0220 10:14:10.824576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.830843 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.831015 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.831249 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.831481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.838608 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.839408 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.839696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.860416 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.860659 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.917338 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.929077 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.929727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.930259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.930832 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.931479 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.933719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.954063 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.968108 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.014692 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.026541 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.026843 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031142 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhn7" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031523 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.061107 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139342 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.157928 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.167809 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.167996 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.171290 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.171608 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244094 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244398 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.247562 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.248650 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.249813 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s4qgr"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.256155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.257873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.259454 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.274112 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.295646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.299281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.308298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.346960 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347044 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347165 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.378403 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449555 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449741 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " 
pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449872 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.451288 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.451821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.459557 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.461920 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.472639 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.462054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.481560 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v7sjh"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.483415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc 
kubenswrapper[4962]: I0220 10:14:11.511618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.512745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerStarted","Data":"45fbf95945cbd43ce30cabe021a4e7f4f8190da7cdb4b82be6352fe237f87d78"} Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.516746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerStarted","Data":"56b25d7c906db1005eebceb9a0a6f02f1965ac71cc6b3cb440b4767a03118405"} Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.525898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" event={"ID":"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3","Type":"ContainerStarted","Data":"306dc819f0dbd42e6bb0a1d32df3e770a0d5883fd79bec0c6e92d720cd24ed11"} Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.528724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.613651 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mk67n"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.658128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-smcqr"] Feb 20 
10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.686718 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: W0220 10:14:11.693020 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf95ae2eb_8d20_4549_896d_e6991bfd1e06.slice/crio-15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb WatchSource:0}: Error finding container 15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb: Status 404 returned error can't find the container with id 15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.699546 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.801744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.973663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.309120 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:12 crc kubenswrapper[4962]: W0220 10:14:12.336839 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e102633_92b3_4a5f_952b_9b3d5d5c8642.slice/crio-115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9 WatchSource:0}: Error finding container 115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9: Status 404 returned error can't find the container with id 115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9 Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.551724 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerStarted","Data":"15539448337a5b961d9b0ef7e9cec1129487956e6df6174e8cd859d99a2fb5ff"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.555060 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerStarted","Data":"30186548c67af6b38295049a03fc70d8716a830b2fea8ffe32d0d440d68c2923"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.557861 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerStarted","Data":"c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.557888 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerStarted","Data":"dce69904baa73ecb44c9d47c1bcf4bebcf4df75f9166693b47df46309236c541"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.563645 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerID="0f06d40b0139e2fd2c899292daba5486f90405f7482adbe25ea92101fbe15c2e" exitCode=0 Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.563720 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" event={"ID":"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3","Type":"ContainerDied","Data":"0f06d40b0139e2fd2c899292daba5486f90405f7482adbe25ea92101fbe15c2e"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.567776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" 
event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerStarted","Data":"06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.570469 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.578666 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerStarted","Data":"e9a2fb5aa6c019bc1ceb09e0e16ebaf81860fd8433b5fa16ea5575cfba68806b"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.583109 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-v7sjh" podStartSLOduration=2.5830821090000002 podStartE2EDuration="2.583082109s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:12.574802391 +0000 UTC m=+1144.157274227" watchObservedRunningTime="2026-02-20 10:14:12.583082109 +0000 UTC m=+1144.165553955" Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.583319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerStarted","Data":"115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.605527 4962 generic.go:334] "Generic (PLEG): container finished" podID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerID="2ed06c9914b443038fe5a8020b56e2a2f1aa8bba18873866ee6a64f32e0d9f5e" exitCode=0 Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.605586 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerDied","Data":"2ed06c9914b443038fe5a8020b56e2a2f1aa8bba18873866ee6a64f32e0d9f5e"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.605639 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerStarted","Data":"05f3668ddf59db31b6f76b94605c89fd97fdc7f2e57b881b4ff06bffb9a82723"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.616860 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fc9c5" podStartSLOduration=3.616837328 podStartE2EDuration="3.616837328s" podCreationTimestamp="2026-02-20 10:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:12.601256364 +0000 UTC m=+1144.183728210" watchObservedRunningTime="2026-02-20 10:14:12.616837328 +0000 UTC m=+1144.199309174" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.201762 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.337870 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.337939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338041 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.355371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd" (OuterVolumeSpecName: "kube-api-access-t42qd") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "kube-api-access-t42qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.399130 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.412431 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.416552 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.423961 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.430442 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config" (OuterVolumeSpecName: "config") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.432898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441609 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441840 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441906 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441959 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.442014 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.442065 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.491946 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.510304 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:13 crc 
kubenswrapper[4962]: I0220 10:14:13.644774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerStarted","Data":"9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.646070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.655095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerStarted","Data":"e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.672995 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.673702 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" event={"ID":"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3","Type":"ContainerDied","Data":"306dc819f0dbd42e6bb0a1d32df3e770a0d5883fd79bec0c6e92d720cd24ed11"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.673781 4962 scope.go:117] "RemoveContainer" containerID="0f06d40b0139e2fd2c899292daba5486f90405f7482adbe25ea92101fbe15c2e" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.688376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerStarted","Data":"30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.699946 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" podStartSLOduration=3.699909776 
podStartE2EDuration="3.699909776s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:13.680605066 +0000 UTC m=+1145.263076912" watchObservedRunningTime="2026-02-20 10:14:13.699909776 +0000 UTC m=+1145.282381622" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.763085 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.773447 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.700218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerStarted","Data":"6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa"} Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.700404 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log" containerID="cri-o://e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14" gracePeriod=30 Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.700460 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd" containerID="cri-o://6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa" gracePeriod=30 Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.724837 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.724817835 podStartE2EDuration="5.724817835s" 
podCreationTimestamp="2026-02-20 10:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:14.722695719 +0000 UTC m=+1146.305167565" watchObservedRunningTime="2026-02-20 10:14:14.724817835 +0000 UTC m=+1146.307289681" Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.169933 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" path="/var/lib/kubelet/pods/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3/volumes" Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.717945 4962 generic.go:334] "Generic (PLEG): container finished" podID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerID="6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa" exitCode=0 Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.718302 4962 generic.go:334] "Generic (PLEG): container finished" podID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerID="e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14" exitCode=143 Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.718019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerDied","Data":"6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa"} Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.718408 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerDied","Data":"e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14"} Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.723041 4962 generic.go:334] "Generic (PLEG): container finished" podID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerID="06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b" exitCode=0 Feb 20 10:14:15 crc 
kubenswrapper[4962]: I0220 10:14:15.723190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerDied","Data":"06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b"}
Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.726216 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerStarted","Data":"2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1"}
Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.726356 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" containerID="cri-o://30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72" gracePeriod=30
Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.726484 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" containerID="cri-o://2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1" gracePeriod=30
Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.776922 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.776894659 podStartE2EDuration="5.776894659s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:15.767351203 +0000 UTC m=+1147.349823049" watchObservedRunningTime="2026-02-20 10:14:15.776894659 +0000 UTC m=+1147.359366505"
Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744139 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerID="2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1" exitCode=0
Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744229 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerID="30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72" exitCode=143
Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744246 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerDied","Data":"2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1"}
Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerDied","Data":"30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72"}
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.374552 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") "
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") "
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450952 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") "
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") "
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.451035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") "
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.451104 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") "
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.460676 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.465002 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh" (OuterVolumeSpecName: "kube-api-access-zchrh") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "kube-api-access-zchrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.474118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.484121 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts" (OuterVolumeSpecName: "scripts") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.490066 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.508449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data" (OuterVolumeSpecName: "config-data") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553723 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553768 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553781 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553791 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.554181 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.554197 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.764530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerDied","Data":"45fbf95945cbd43ce30cabe021a4e7f4f8190da7cdb4b82be6352fe237f87d78"}
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.764582 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45fbf95945cbd43ce30cabe021a4e7f4f8190da7cdb4b82be6352fe237f87d78"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.764679 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.823674 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"]
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.833877 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"]
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.917940 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r4hdf"]
Feb 20 10:14:17 crc kubenswrapper[4962]: E0220 10:14:17.919006 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerName="init"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.919116 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerName="init"
Feb 20 10:14:17 crc kubenswrapper[4962]: E0220 10:14:17.919224 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerName="keystone-bootstrap"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.919301 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerName="keystone-bootstrap"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.919953 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerName="keystone-bootstrap"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.920173 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerName="init"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.921230 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.925214 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.926101 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.926702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.926964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.927280 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.930838 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"]
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962692 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962834 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064154 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064263 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.069004 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.069342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.069689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.078372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.078700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.085829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.242990 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf"
Feb 20 10:14:19 crc kubenswrapper[4962]: I0220 10:14:19.154358 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" path="/var/lib/kubelet/pods/8741789b-8f62-4fc9-b811-b48d1f72658b/volumes"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.333994 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.465705 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.465815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.465897 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.466057 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468267 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468577 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.469457 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs" (OuterVolumeSpecName: "logs") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.469578 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.477793 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb" (OuterVolumeSpecName: "kube-api-access-2hkdb") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "kube-api-access-2hkdb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.477895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.491739 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts" (OuterVolumeSpecName: "scripts") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.518992 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.536756 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data" (OuterVolumeSpecName: "config-data") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.552467 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572288 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572348 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572366 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572421 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572435 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572448 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572461 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572472 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.601557 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.676778 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.798640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerDied","Data":"30186548c67af6b38295049a03fc70d8716a830b2fea8ffe32d0d440d68c2923"}
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.798921 4962 scope.go:117] "RemoveContainer" containerID="6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.798986 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.855764 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.870995 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.882879 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 10:14:20 crc kubenswrapper[4962]: E0220 10:14:20.883441 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883460 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log"
Feb 20 10:14:20 crc kubenswrapper[4962]: E0220 10:14:20.883497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883507 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883719 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883744 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.885221 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.890201 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.890634 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.892424 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.970762 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983012 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983103 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983144 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983263 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.040578 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"]
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.040919 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" containerID="cri-o://e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab" gracePeriod=10
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.085821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086295 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0"
Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.087298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") "
pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.087514 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.088101 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.095321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.095329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.096823 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 
10:14:21.099525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.108286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.148864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.167028 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" path="/var/lib/kubelet/pods/5f79e11c-6024-4774-9ed7-6d08e5b63442/volumes" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.215350 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.829974 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.837749 4962 generic.go:334] "Generic (PLEG): container finished" podID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerID="e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab" exitCode=0 Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.837813 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerDied","Data":"e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab"} Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.461156 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.546864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549077 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549345 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549457 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549686 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs" (OuterVolumeSpecName: "logs") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.550454 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.550488 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.556898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.568592 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw" (OuterVolumeSpecName: "kube-api-access-dv9jw") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "kube-api-access-dv9jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.569344 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts" (OuterVolumeSpecName: "scripts") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.606002 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.621114 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data" (OuterVolumeSpecName: "config-data") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.632303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.652970 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653013 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653071 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653086 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653106 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653129 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.674551 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.755133 4962 reconciler_common.go:293] "Volume detached for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.873449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerDied","Data":"115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9"} Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.873830 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.917153 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.925520 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942139 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:23 crc kubenswrapper[4962]: E0220 10:14:23.942641 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942664 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" Feb 20 10:14:23 crc kubenswrapper[4962]: E0220 10:14:23.942692 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942884 4962 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942913 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.944167 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.947789 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.948010 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.970021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062378 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062459 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062559 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.164970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165083 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165117 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165440 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165971 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166040 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod 
\"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.171322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.173271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.175441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.177345 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " 
pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.183043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.197712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.270826 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:25 crc kubenswrapper[4962]: I0220 10:14:25.161452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" path="/var/lib/kubelet/pods/5e102633-92b3-4a5f-952b-9b3d5d5c8642/volumes" Feb 20 10:14:26 crc kubenswrapper[4962]: I0220 10:14:26.831396 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Feb 20 10:14:29 crc kubenswrapper[4962]: I0220 10:14:29.953619 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerID="c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0" exitCode=0 Feb 20 10:14:29 crc kubenswrapper[4962]: I0220 10:14:29.953720 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" 
event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerDied","Data":"c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0"} Feb 20 10:14:31 crc kubenswrapper[4962]: I0220 10:14:31.829662 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Feb 20 10:14:31 crc kubenswrapper[4962]: I0220 10:14:31.830349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:32 crc kubenswrapper[4962]: E0220 10:14:32.787793 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec" Feb 20 10:14:32 crc kubenswrapper[4962]: E0220 10:14:32.788569 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpd4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-smcqr_openstack(d970dac6-1948-42dd-b5d9-c5df1b04e30d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:14:32 crc kubenswrapper[4962]: E0220 10:14:32.790029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-smcqr" 
podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" Feb 20 10:14:33 crc kubenswrapper[4962]: E0220 10:14:33.018811 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec\\\"\"" pod="openstack/barbican-db-sync-smcqr" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" Feb 20 10:14:33 crc kubenswrapper[4962]: E0220 10:14:33.314773 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4" Feb 20 10:14:33 crc kubenswrapper[4962]: E0220 10:14:33.315004 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55dh556h5c8h587h7bh669h54bh66fh5bdh679h5bdh68dhcdh85h5dfh694h5cfh54h58ch646h77h5b9h5bh6bhb4h5cfh666h559h687h555h68ch5dcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdggc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(f95ae2eb-8d20-4549-896d-e6991bfd1e06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.525995 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.576711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.576925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.576969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.591223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq" (OuterVolumeSpecName: "kube-api-access-9jjsq") pod "6b114dbd-1f72-42c9-97c1-43795d1cf1ea" (UID: "6b114dbd-1f72-42c9-97c1-43795d1cf1ea"). InnerVolumeSpecName "kube-api-access-9jjsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.609302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b114dbd-1f72-42c9-97c1-43795d1cf1ea" (UID: "6b114dbd-1f72-42c9-97c1-43795d1cf1ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.614410 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config" (OuterVolumeSpecName: "config") pod "6b114dbd-1f72-42c9-97c1-43795d1cf1ea" (UID: "6b114dbd-1f72-42c9-97c1-43795d1cf1ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.678305 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.678354 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.678370 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.021430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerDied","Data":"dce69904baa73ecb44c9d47c1bcf4bebcf4df75f9166693b47df46309236c541"} Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.021542 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce69904baa73ecb44c9d47c1bcf4bebcf4df75f9166693b47df46309236c541" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.021692 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.862287 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:14:34 crc kubenswrapper[4962]: E0220 10:14:34.863102 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerName="neutron-db-sync" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.863123 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerName="neutron-db-sync" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.863331 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerName="neutron-db-sync" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.864327 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.907993 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908055 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.919242 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.920812 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.927769 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.928208 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bnxb6" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.928484 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.930622 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.952576 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.962193 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009075 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009277 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009302 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w864r\" (UniqueName: 
\"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009352 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009380 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.010237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.010387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.012163 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.012222 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.012931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.048293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.111928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q54gm\" (UniqueName: 
\"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.111996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.112032 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.112158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.112207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.118394 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"neutron-ffdf447d4-qtmvr\" (UID: 
\"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.118646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.119025 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.129780 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.130386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.200549 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.253971 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.436697 4962 scope.go:117] "RemoveContainer" containerID="e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14" Feb 20 10:14:35 crc kubenswrapper[4962]: E0220 10:14:35.458460 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 20 10:14:35 crc kubenswrapper[4962]: E0220 10:14:35.458765 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount
{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-frwwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-s4qgr_openstack(14c237ea-eb42-49d4-90db-ee57e3b560e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:14:35 crc kubenswrapper[4962]: E0220 10:14:35.460526 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-s4qgr" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.558932 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.588225 4962 scope.go:117] "RemoveContainer" containerID="2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.678920 4962 scope.go:117] "RemoveContainer" containerID="30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727467 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727584 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727686 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727802 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.759911 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg" (OuterVolumeSpecName: "kube-api-access-wfvtg") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "kube-api-access-wfvtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.807815 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.831573 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.831622 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.863450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.870218 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.875806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.907181 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config" (OuterVolumeSpecName: "config") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937208 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937246 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937267 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937280 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.010819 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.059380 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" 
event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerStarted","Data":"6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680"} Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.068022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerDied","Data":"83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879"} Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.068077 4962 scope.go:117] "RemoveContainer" containerID="e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.068206 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.078743 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerStarted","Data":"110ade6ee18c3222118e3294958d83a3df26c7aeb2099802b229b066aa852315"} Feb 20 10:14:36 crc kubenswrapper[4962]: E0220 10:14:36.089825 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-s4qgr" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.176757 4962 scope.go:117] "RemoveContainer" containerID="c3f233006bdf1d16d8946733067213908be75ed885abe76b0ea0e53fac4b17ed" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.204540 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mk67n" podStartSLOduration=4.569928049 
podStartE2EDuration="26.204495078s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="2026-02-20 10:14:11.668057516 +0000 UTC m=+1143.250529362" lastFinishedPulling="2026-02-20 10:14:33.302624505 +0000 UTC m=+1164.885096391" observedRunningTime="2026-02-20 10:14:36.127101851 +0000 UTC m=+1167.709573697" watchObservedRunningTime="2026-02-20 10:14:36.204495078 +0000 UTC m=+1167.786966924" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.288684 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.321191 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.349404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.571929 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.709967 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.128238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerStarted","Data":"f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.136018 4962 generic.go:334] "Generic (PLEG): container finished" podID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" exitCode=0 Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.136109 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" 
event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerDied","Data":"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.136136 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerStarted","Data":"48cb63ddc063927b98c0acd0bd8342b9608beaa369b2e5045eda23591cd6bfd4"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.170376 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r4hdf" podStartSLOduration=20.170335151 podStartE2EDuration="20.170335151s" podCreationTimestamp="2026-02-20 10:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:37.150309428 +0000 UTC m=+1168.732781274" watchObservedRunningTime="2026-02-20 10:14:37.170335151 +0000 UTC m=+1168.752806997" Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.173608 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" path="/var/lib/kubelet/pods/90cdf678-dd6c-4f3b-a675-4803eddcfc44/volumes" Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.185783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerStarted","Data":"80442f73d0bb0b1518e80cca1be32921f22e35f7b8a9442cfe4b3a67ae521feb"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.212889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerStarted","Data":"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.212937 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerStarted","Data":"81c44508e7e551a0ec9263f4a7d0314158cbc47cdfa61ceff1466d2aef98334e"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.453089 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:37 crc kubenswrapper[4962]: W0220 10:14:37.865123 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8bed08_cb47_42cb_a192_2545a14e4c4b.slice/crio-9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248 WatchSource:0}: Error finding container 9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248: Status 404 returned error can't find the container with id 9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248 Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.241562 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerStarted","Data":"9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.247030 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerStarted","Data":"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.248478 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.250166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerStarted","Data":"eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.258370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerStarted","Data":"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.285241 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" podStartSLOduration=4.285214678 podStartE2EDuration="4.285214678s" podCreationTimestamp="2026-02-20 10:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:38.277549029 +0000 UTC m=+1169.860020875" watchObservedRunningTime="2026-02-20 10:14:38.285214678 +0000 UTC m=+1169.867686524" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.329172 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:14:38 crc kubenswrapper[4962]: E0220 10:14:38.329793 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="init" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.329810 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="init" Feb 20 10:14:38 crc kubenswrapper[4962]: E0220 10:14:38.329823 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.329829 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 
10:14:38.330043 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.331093 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.341297 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.341484 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.362931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.369842 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ffdf447d4-qtmvr" podStartSLOduration=4.369819358 podStartE2EDuration="4.369819358s" podCreationTimestamp="2026-02-20 10:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:38.334696216 +0000 UTC m=+1169.917168082" watchObservedRunningTime="2026-02-20 10:14:38.369819358 +0000 UTC m=+1169.952291204" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514544 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514572 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514616 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514640 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619945 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619979 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.631539 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.638694 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.638771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: 
\"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.639464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.641721 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.641738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.660417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.733854 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.284306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerStarted","Data":"d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.293708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.330961 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerStarted","Data":"888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.334616 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.334573087 podStartE2EDuration="19.334573087s" podCreationTimestamp="2026-02-20 10:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:39.317238198 +0000 UTC m=+1170.899710054" watchObservedRunningTime="2026-02-20 10:14:39.334573087 +0000 UTC m=+1170.917044923" Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.338275 4962 generic.go:334] "Generic (PLEG): container finished" podID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerID="6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680" exitCode=0 Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.338349 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" 
event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerDied","Data":"6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.338834 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.639635 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.372299 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerStarted","Data":"9861804d9a2a53f03b608fd61261417b654233e3abb2bbc1678d9e37df3e329e"} Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.823833 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: 
\"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918623 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs" (OuterVolumeSpecName: "logs") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.919456 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.925841 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw" (OuterVolumeSpecName: "kube-api-access-mpbgw") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "kube-api-access-mpbgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.926222 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts" (OuterVolumeSpecName: "scripts") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.948466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.950809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data" (OuterVolumeSpecName: "config-data") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022404 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022457 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022471 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022491 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.216181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.216470 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.261124 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.271197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.392201 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerStarted","Data":"ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.407604 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerDied","Data":"15539448337a5b961d9b0ef7e9cec1129487956e6df6174e8cd859d99a2fb5ff"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.407666 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15539448337a5b961d9b0ef7e9cec1129487956e6df6174e8cd859d99a2fb5ff" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.407939 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.432465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerStarted","Data":"ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.445656 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerDied","Data":"f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.445577 4962 generic.go:334] "Generic (PLEG): container finished" podID="37eccece-549c-4b2f-b066-481b216d7ece" containerID="f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351" exitCode=0 Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.447821 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc 
kubenswrapper[4962]: I0220 10:14:41.447862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.497864 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:14:41 crc kubenswrapper[4962]: E0220 10:14:41.498468 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerName="placement-db-sync" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.498490 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerName="placement-db-sync" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.498965 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerName="placement-db-sync" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.500281 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.506721 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.506964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.506983 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.507271 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.509006 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gtm5t" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.544232 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647954 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647979 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.648027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.648069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.648110 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750087 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750308 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.752159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.756847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.757866 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.759956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod 
\"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.760156 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.760982 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.770438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.828240 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.472057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerStarted","Data":"bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f"} Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.472893 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.472950 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.514924 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.514897468 podStartE2EDuration="19.514897468s" podCreationTimestamp="2026-02-20 10:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:42.510960555 +0000 UTC m=+1174.093432411" watchObservedRunningTime="2026-02-20 10:14:42.514897468 +0000 UTC m=+1174.097369314" Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.554157 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-747dfbc745-ndpzt" podStartSLOduration=4.554135908 podStartE2EDuration="4.554135908s" podCreationTimestamp="2026-02-20 10:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:42.547272514 +0000 UTC m=+1174.129744360" watchObservedRunningTime="2026-02-20 10:14:42.554135908 +0000 UTC m=+1174.136607754" Feb 20 10:14:43 crc kubenswrapper[4962]: I0220 10:14:43.486319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerStarted","Data":"c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f"} Feb 20 10:14:43 crc kubenswrapper[4962]: I0220 10:14:43.486900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerStarted","Data":"278c9072e567ac676f1ff447db5bfcb24f5eba477a61436baf57c6f5bf95aba9"} Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.271974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.272528 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.327121 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.340242 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.497989 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.498069 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.671468 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.683789 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:45 crc 
kubenswrapper[4962]: I0220 10:14:45.099498 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.203755 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227742 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227764 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227801 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227934 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.284568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g" (OuterVolumeSpecName: "kube-api-access-h688g") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "kube-api-access-h688g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.285744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts" (OuterVolumeSpecName: "scripts") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.285859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.292744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.308739 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.309009 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" containerID="cri-o://9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234" gracePeriod=10 Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334442 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334482 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334501 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.377997 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.427275 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data" (OuterVolumeSpecName: "config-data") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.437312 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.437364 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.512062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerStarted","Data":"bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.512496 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.512540 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.521280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerDied","Data":"110ade6ee18c3222118e3294958d83a3df26c7aeb2099802b229b066aa852315"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.521338 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110ade6ee18c3222118e3294958d83a3df26c7aeb2099802b229b066aa852315" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.521388 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.532847 4962 generic.go:334] "Generic (PLEG): container finished" podID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerID="9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234" exitCode=0 Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.533130 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerDied","Data":"9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.548206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.550423 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-755cb8b5f4-zlzbb" 
podStartSLOduration=4.550402626 podStartE2EDuration="4.550402626s" podCreationTimestamp="2026-02-20 10:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:45.539855918 +0000 UTC m=+1177.122327764" watchObservedRunningTime="2026-02-20 10:14:45.550402626 +0000 UTC m=+1177.132874472" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.781209 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.844394 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.844515 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.844547 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.845379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc 
kubenswrapper[4962]: I0220 10:14:45.845425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.845530 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.849876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl" (OuterVolumeSpecName: "kube-api-access-f5knl") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "kube-api-access-f5knl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.902088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.908884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.911407 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config" (OuterVolumeSpecName: "config") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.913170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.927118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947323 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947444 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947516 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947575 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947688 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947755 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.275106 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:14:46 crc kubenswrapper[4962]: E0220 10:14:46.275792 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" 
containerName="dnsmasq-dns" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.275868 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" Feb 20 10:14:46 crc kubenswrapper[4962]: E0220 10:14:46.275931 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="init" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.275993 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="init" Feb 20 10:14:46 crc kubenswrapper[4962]: E0220 10:14:46.276092 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eccece-549c-4b2f-b066-481b216d7ece" containerName="keystone-bootstrap" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.276156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eccece-549c-4b2f-b066-481b216d7ece" containerName="keystone-bootstrap" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.276414 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.276480 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eccece-549c-4b2f-b066-481b216d7ece" containerName="keystone-bootstrap" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.277140 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.287777 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288161 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288489 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288567 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288818 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.289333 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.313605 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353620 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") 
" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353793 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353912 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " 
pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353938 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc 
kubenswrapper[4962]: I0220 10:14:46.455913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.456047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.456171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.456326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.462720 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.462738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.463484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.463570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.464020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.469199 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.473482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod 
\"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.478046 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.560364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerDied","Data":"05f3668ddf59db31b6f76b94605c89fd97fdc7f2e57b881b4ff06bffb9a82723"} Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.560460 4962 scope.go:117] "RemoveContainer" containerID="9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.560387 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.597196 4962 scope.go:117] "RemoveContainer" containerID="2ed06c9914b443038fe5a8020b56e2a2f1aa8bba18873866ee6a64f32e0d9f5e" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.621137 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.627552 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.636889 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.892877 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.155731 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" path="/var/lib/kubelet/pods/857e020f-54d5-4980-90a2-f19d6f8b5008/volumes" Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.157234 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.587383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerStarted","Data":"57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14"} Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.587818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerStarted","Data":"cf5b12fd788026ff0304070e27b4ebd505b31b0d0e831a4ccd6e51bd8bb0b383"} Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.588205 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.609516 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6b4c54c5d9-pqd8r" podStartSLOduration=1.609493982 
podStartE2EDuration="1.609493982s" podCreationTimestamp="2026-02-20 10:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:47.607649335 +0000 UTC m=+1179.190121181" watchObservedRunningTime="2026-02-20 10:14:47.609493982 +0000 UTC m=+1179.191965828" Feb 20 10:14:48 crc kubenswrapper[4962]: I0220 10:14:48.619650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerStarted","Data":"a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686"} Feb 20 10:14:48 crc kubenswrapper[4962]: I0220 10:14:48.643125 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-smcqr" podStartSLOduration=2.703463313 podStartE2EDuration="38.643097592s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="2026-02-20 10:14:11.683073854 +0000 UTC m=+1143.265545700" lastFinishedPulling="2026-02-20 10:14:47.622708133 +0000 UTC m=+1179.205179979" observedRunningTime="2026-02-20 10:14:48.637290901 +0000 UTC m=+1180.219762747" watchObservedRunningTime="2026-02-20 10:14:48.643097592 +0000 UTC m=+1180.225569438" Feb 20 10:14:49 crc kubenswrapper[4962]: I0220 10:14:49.190763 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:14:49 crc kubenswrapper[4962]: I0220 10:14:49.677506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerStarted","Data":"ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952"} Feb 20 10:14:49 crc kubenswrapper[4962]: I0220 10:14:49.696327 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-s4qgr" podStartSLOduration=2.415795337 
podStartE2EDuration="39.696295821s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="2026-02-20 10:14:11.250803902 +0000 UTC m=+1142.833275748" lastFinishedPulling="2026-02-20 10:14:48.531304386 +0000 UTC m=+1180.113776232" observedRunningTime="2026-02-20 10:14:49.693922826 +0000 UTC m=+1181.276394672" watchObservedRunningTime="2026-02-20 10:14:49.696295821 +0000 UTC m=+1181.278767667" Feb 20 10:14:53 crc kubenswrapper[4962]: I0220 10:14:53.246811 4962 generic.go:334] "Generic (PLEG): container finished" podID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerID="a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686" exitCode=0 Feb 20 10:14:53 crc kubenswrapper[4962]: I0220 10:14:53.248823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerDied","Data":"a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686"} Feb 20 10:14:54 crc kubenswrapper[4962]: I0220 10:14:54.262562 4962 generic.go:334] "Generic (PLEG): container finished" podID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerID="ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952" exitCode=0 Feb 20 10:14:54 crc kubenswrapper[4962]: I0220 10:14:54.262725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerDied","Data":"ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952"} Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.705841 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.713049 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874316 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874411 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpd4w\" 
(UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.875039 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.879070 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts" (OuterVolumeSpecName: "scripts") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.880063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.881834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh" (OuterVolumeSpecName: "kube-api-access-frwwh") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "kube-api-access-frwwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.882439 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d970dac6-1948-42dd-b5d9-c5df1b04e30d" (UID: "d970dac6-1948-42dd-b5d9-c5df1b04e30d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.882971 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w" (OuterVolumeSpecName: "kube-api-access-wpd4w") pod "d970dac6-1948-42dd-b5d9-c5df1b04e30d" (UID: "d970dac6-1948-42dd-b5d9-c5df1b04e30d"). InnerVolumeSpecName "kube-api-access-wpd4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.903904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.912916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d970dac6-1948-42dd-b5d9-c5df1b04e30d" (UID: "d970dac6-1948-42dd-b5d9-c5df1b04e30d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: E0220 10:14:55.936122 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.958848 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data" (OuterVolumeSpecName: "config-data") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978705 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978749 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978767 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978783 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978799 
4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978813 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978825 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978835 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978845 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.289185 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerDied","Data":"e9a2fb5aa6c019bc1ceb09e0e16ebaf81860fd8433b5fa16ea5575cfba68806b"} Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.289245 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.289272 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a2fb5aa6c019bc1ceb09e0e16ebaf81860fd8433b5fa16ea5575cfba68806b" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.293662 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.293670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerDied","Data":"56b25d7c906db1005eebceb9a0a6f02f1965ac71cc6b3cb440b4767a03118405"} Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.293723 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56b25d7c906db1005eebceb9a0a6f02f1965ac71cc6b3cb440b4767a03118405" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.296920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b"} Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.297193 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" containerID="cri-o://c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca" gracePeriod=30 Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.297862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.297994 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" containerID="cri-o://3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85" gracePeriod=30 Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.298063 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" containerID="cri-o://64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b" gracePeriod=30 Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.695786 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:14:56 crc kubenswrapper[4962]: E0220 10:14:56.696497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerName="cinder-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696512 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerName="cinder-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: E0220 10:14:56.696532 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerName="barbican-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696539 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerName="barbican-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerName="cinder-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696789 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerName="barbican-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.698212 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.702427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-87fmw" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.715498 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.715736 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.715858 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.723100 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.724951 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.752392 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.770097 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807897 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 
10:14:56.807971 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808053 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: 
I0220 10:14:56.808138 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808156 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 
10:14:56.909435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909513 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909545 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909569 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909705 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.910471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.910571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.910780 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.918582 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.919885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.920352 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc 
kubenswrapper[4962]: I0220 10:14:56.929920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.930279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.930686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.937576 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.938344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.948388 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwhpz\" (UniqueName: 
\"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.981835 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.983501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.991892 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.992040 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bt79l" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.992299 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011444 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011554 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.016694 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.022404 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.037657 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.052299 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.054464 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.062752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.073970 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod 
\"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116379 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116401 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.151220 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.151862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.152332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.168545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.184406 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.253400 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255399 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255628 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.254940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.274207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.284660 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.304940 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.307863 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.310554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.310711 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.313894 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.315525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.353388 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.358895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.358951 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.358996 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359066 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 
10:14:57.359252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359271 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.380417 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.384399 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.386480 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390257 4962 generic.go:334] "Generic (PLEG): container finished" podID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerID="64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b" exitCode=0 Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390289 4962 generic.go:334] "Generic (PLEG): container finished" podID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerID="3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85" exitCode=2 Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b"} Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85"} Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.394571 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.412450 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.419740 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.422876 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.429275 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461244 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461451 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461477 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpkv2\" (UniqueName: 
\"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463687 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.464337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.467392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.468361 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.470888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.474620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.488145 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.496585 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.565917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.565997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " 
pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566079 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc 
kubenswrapper[4962]: I0220 10:14:57.566226 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.567198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.571645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.574243 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.574387 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.575797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.576014 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.576420 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.576761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.579023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.595433 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.598888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.692949 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.727317 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.747622 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.812326 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.822974 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:57 crc kubenswrapper[4962]: W0220 10:14:57.876656 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8794235c_580c_4874_94c2_3b28620e3fdb.slice/crio-9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d WatchSource:0}: Error finding container 9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d: Status 404 returned error can't find the container with id 9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.987616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.085100 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.234819 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.360798 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:14:58 crc kubenswrapper[4962]: W0220 10:14:58.365164 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97f91e3_f497_47ad_8d3d_f9945b3bdc34.slice/crio-a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115 WatchSource:0}: Error finding container 
a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115: Status 404 returned error can't find the container with id a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115 Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.391543 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.405199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerStarted","Data":"a34d63171eeb032e506f3c3f6390187d10864d694aff1bd3157c782304896d3f"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.407333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerStarted","Data":"d28e07e3df71ca3ef7da3f055fa5cde6ea2cae0c5e5a865d321a0f8d0fb07b31"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.409637 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerStarted","Data":"0bed354fd9a98e89b5d38e5675524156eb0b61c69b251716c3b22a1d0bef6443"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.411191 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerStarted","Data":"a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.412696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerStarted","Data":"ce6724e81ded72c0c4f8c258e7503371ea527420f9524e9628d9189a535cf4b5"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 
10:14:58.415433 4962 generic.go:334] "Generic (PLEG): container finished" podID="8794235c-580c-4874-94c2-3b28620e3fdb" containerID="5711558b18ca02e0aedc727fd6791cedd33bb082c0d7ab4780bc410966410664" exitCode=0 Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.415515 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" event={"ID":"8794235c-580c-4874-94c2-3b28620e3fdb","Type":"ContainerDied","Data":"5711558b18ca02e0aedc727fd6791cedd33bb082c0d7ab4780bc410966410664"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.415538 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" event={"ID":"8794235c-580c-4874-94c2-3b28620e3fdb","Type":"ContainerStarted","Data":"9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.418390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerStarted","Data":"5638e382b35ddc2f5cb2cd42c5a2bc839053a009a82ec0299379e273d8965fd5"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.926476 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107879 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107958 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.108061 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.108150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwhpz\" 
(UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.114753 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz" (OuterVolumeSpecName: "kube-api-access-nwhpz") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "kube-api-access-nwhpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.155494 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.155518 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.197471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.197998 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210414 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210453 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210464 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210475 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210489 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.253531 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config" (OuterVolumeSpecName: "config") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.315182 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.471815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" event={"ID":"8794235c-580c-4874-94c2-3b28620e3fdb","Type":"ContainerDied","Data":"9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.472290 4962 scope.go:117] "RemoveContainer" containerID="5711558b18ca02e0aedc727fd6791cedd33bb082c0d7ab4780bc410966410664" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.471877 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.479135 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerStarted","Data":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.484920 4962 generic.go:334] "Generic (PLEG): container finished" podID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerID="b058ae21e0210458ea10ea644bf00ea0438ea36899818d84b992ed449c70fc86" exitCode=0 Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.485019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerDied","Data":"b058ae21e0210458ea10ea644bf00ea0438ea36899818d84b992ed449c70fc86"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.488129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerStarted","Data":"d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.488180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerStarted","Data":"55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.488687 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.489208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.573664 4962 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.589794 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.596010 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-547b9d9588-5gkt7" podStartSLOduration=2.595985748 podStartE2EDuration="2.595985748s" podCreationTimestamp="2026-02-20 10:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:59.555959404 +0000 UTC m=+1191.138431250" watchObservedRunningTime="2026-02-20 10:14:59.595985748 +0000 UTC m=+1191.178457594" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.698646 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.141188 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 10:15:00 crc kubenswrapper[4962]: E0220 10:15:00.145132 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" containerName="init" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.145275 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" containerName="init" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.147232 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" containerName="init" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.148271 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.152742 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.162816 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.197513 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.243833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.243957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.244066 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.345714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.345809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.345865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.347216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.351525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.366372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.503301 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerStarted","Data":"f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f"} Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.519885 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.088796 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 10:15:01 crc kubenswrapper[4962]: W0220 10:15:01.120608 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9c6a80_7747_461e_8f29_f371984a8c95.slice/crio-2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2 WatchSource:0}: Error finding container 2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2: Status 404 returned error can't find the container with id 2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2 Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.152071 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" path="/var/lib/kubelet/pods/8794235c-580c-4874-94c2-3b28620e3fdb/volumes" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.530323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerStarted","Data":"8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.530986 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.544043 4962 generic.go:334] "Generic (PLEG): container finished" podID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerID="c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca" exitCode=0 Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.544099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.558577 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" podStartSLOduration=4.558562283 podStartE2EDuration="4.558562283s" podCreationTimestamp="2026-02-20 10:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:01.554959612 +0000 UTC m=+1193.137431458" watchObservedRunningTime="2026-02-20 10:15:01.558562283 +0000 UTC m=+1193.141034129" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.569668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerStarted","Data":"c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.569727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerStarted","Data":"2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.579267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerStarted","Data":"d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.579313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" 
event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerStarted","Data":"1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.581679 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.589393 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" podStartSLOduration=1.589371111 podStartE2EDuration="1.589371111s" podCreationTimestamp="2026-02-20 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:01.587687369 +0000 UTC m=+1193.170159225" watchObservedRunningTime="2026-02-20 10:15:01.589371111 +0000 UTC m=+1193.171842957" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.590446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerStarted","Data":"5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.590506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerStarted","Data":"6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.647173 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b8479d945-8wsh9" podStartSLOduration=3.061603032 podStartE2EDuration="5.647153159s" podCreationTimestamp="2026-02-20 10:14:56 +0000 UTC" firstStartedPulling="2026-02-20 10:14:58.01507318 +0000 UTC m=+1189.597545026" 
lastFinishedPulling="2026-02-20 10:15:00.600623287 +0000 UTC m=+1192.183095153" observedRunningTime="2026-02-20 10:15:01.642555125 +0000 UTC m=+1193.225026971" watchObservedRunningTime="2026-02-20 10:15:01.647153159 +0000 UTC m=+1193.229625005" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.671563 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" podStartSLOduration=3.22683698 podStartE2EDuration="5.671544567s" podCreationTimestamp="2026-02-20 10:14:56 +0000 UTC" firstStartedPulling="2026-02-20 10:14:58.098518665 +0000 UTC m=+1189.680990511" lastFinishedPulling="2026-02-20 10:15:00.543226252 +0000 UTC m=+1192.125698098" observedRunningTime="2026-02-20 10:15:01.664635952 +0000 UTC m=+1193.247107798" watchObservedRunningTime="2026-02-20 10:15:01.671544567 +0000 UTC m=+1193.254016413" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.686975 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687270 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687352 
4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687390 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687423 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687488 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.688617 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.689169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.689752 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.689774 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.706401 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts" (OuterVolumeSpecName: "scripts") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.707489 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc" (OuterVolumeSpecName: "kube-api-access-bdggc") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "kube-api-access-bdggc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.730668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.749381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795335 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data" (OuterVolumeSpecName: "config-data") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795754 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795865 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795881 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795893 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.897794 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.605781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.605850 4962 scope.go:117] "RemoveContainer" containerID="64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.606053 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.616274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerStarted","Data":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.616469 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" containerID="cri-o://9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" gracePeriod=30 Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.616862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.617213 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" containerID="cri-o://e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" gracePeriod=30 Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.642517 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerID="c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f" exitCode=0 Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.642635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerDied","Data":"c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.646274 4962 scope.go:117] "RemoveContainer" containerID="3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85" Feb 20 10:15:02 crc 
kubenswrapper[4962]: I0220 10:15:02.651666 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerStarted","Data":"565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.675994 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.675970219 podStartE2EDuration="5.675970219s" podCreationTimestamp="2026-02-20 10:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:02.661959263 +0000 UTC m=+1194.244431109" watchObservedRunningTime="2026-02-20 10:15:02.675970219 +0000 UTC m=+1194.258442065" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.679798 4962 scope.go:117] "RemoveContainer" containerID="c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.777287 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.840170 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.902811 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:02 crc kubenswrapper[4962]: E0220 10:15:02.903676 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.903721 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" Feb 20 10:15:02 crc kubenswrapper[4962]: E0220 10:15:02.903759 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.903769 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" Feb 20 10:15:02 crc kubenswrapper[4962]: E0220 10:15:02.903807 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.903817 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.904080 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.904123 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.904151 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.906916 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.035633728 podStartE2EDuration="6.9068921s" podCreationTimestamp="2026-02-20 10:14:56 +0000 UTC" firstStartedPulling="2026-02-20 10:14:57.85750593 +0000 UTC m=+1189.439977786" lastFinishedPulling="2026-02-20 10:14:58.728764302 +0000 UTC m=+1190.311236158" observedRunningTime="2026-02-20 10:15:02.77761738 +0000 UTC m=+1194.360089226" watchObservedRunningTime="2026-02-20 10:15:02.9068921 +0000 UTC m=+1194.489363946" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.907409 4962 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.910147 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.910806 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.927994 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027974 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.029734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.029813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.029854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131258 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131294 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131986 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " 
pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.132235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.138548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.139054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.139683 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.142955 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.161150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg88s\" (UniqueName: 
\"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.168039 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" path="/var/lib/kubelet/pods/f95ae2eb-8d20-4549-896d-e6991bfd1e06/volumes" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.276496 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.469832 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.473110 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.481456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.481801 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.516322 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.531753 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546079 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546099 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546287 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546607 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546705 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546745 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546779 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546910 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.557923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.558115 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs" (OuterVolumeSpecName: "logs") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.560955 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2" (OuterVolumeSpecName: "kube-api-access-fpkv2") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "kube-api-access-fpkv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.572036 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts" (OuterVolumeSpecName: "scripts") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.606923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.621142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data" (OuterVolumeSpecName: "config-data") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647300 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod 
\"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647438 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647579 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647608 4962 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647619 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647629 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647638 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647646 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647654 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.648370 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652326 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652497 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.653480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.665710 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw2np\" 
(UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667041 4962 generic.go:334] "Generic (PLEG): container finished" podID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" exitCode=0 Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667069 4962 generic.go:334] "Generic (PLEG): container finished" podID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" exitCode=143 Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667303 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerDied","Data":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerDied","Data":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerDied","Data":"5638e382b35ddc2f5cb2cd42c5a2bc839053a009a82ec0299379e273d8965fd5"} Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667767 4962 scope.go:117] "RemoveContainer" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 
10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.706382 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.712760 4962 scope.go:117] "RemoveContainer" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.716858 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.731232 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.731632 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.731649 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.731710 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.731720 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.741711 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.741767 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.748791 4962 scope.go:117] "RemoveContainer" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc 
kubenswrapper[4962]: E0220 10:15:03.749669 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": container with ID starting with e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5 not found: ID does not exist" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.749706 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} err="failed to get container status \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": rpc error: code = NotFound desc = could not find container \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": container with ID starting with e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5 not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.749728 4962 scope.go:117] "RemoveContainer" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.750422 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": container with ID starting with 9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f not found: ID does not exist" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750454 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} err="failed to get container status 
\"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": rpc error: code = NotFound desc = could not find container \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": container with ID starting with 9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750473 4962 scope.go:117] "RemoveContainer" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750830 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} err="failed to get container status \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": rpc error: code = NotFound desc = could not find container \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": container with ID starting with e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5 not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750853 4962 scope.go:117] "RemoveContainer" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.751824 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} err="failed to get container status \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": rpc error: code = NotFound desc = could not find container \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": container with ID starting with 9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.752551 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.752706 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.758171 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.758292 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.758293 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.815494 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " 
pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858952 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859035 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859075 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859152 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.929944 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: W0220 10:15:03.962742 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d02cdb_5de5_457e_9f17_1cc3ba51ca55.slice/crio-26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e WatchSource:0}: Error finding container 26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e: Status 404 returned error can't find the container with id 26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964575 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964671 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964783 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.967377 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.967787 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.974054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.974176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.976087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.976727 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.978003 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.981477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.985844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.083283 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.143858 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.168086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"fc9c6a80-7747-461e-8f29-f371984a8c95\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.168153 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"fc9c6a80-7747-461e-8f29-f371984a8c95\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.168253 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"fc9c6a80-7747-461e-8f29-f371984a8c95\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.169516 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume" (OuterVolumeSpecName: "config-volume") pod "fc9c6a80-7747-461e-8f29-f371984a8c95" (UID: "fc9c6a80-7747-461e-8f29-f371984a8c95"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.173575 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk" (OuterVolumeSpecName: "kube-api-access-lnnhk") pod "fc9c6a80-7747-461e-8f29-f371984a8c95" (UID: "fc9c6a80-7747-461e-8f29-f371984a8c95"). InnerVolumeSpecName "kube-api-access-lnnhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.174496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fc9c6a80-7747-461e-8f29-f371984a8c95" (UID: "fc9c6a80-7747-461e-8f29-f371984a8c95"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.273391 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.273438 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.273455 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.351896 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:15:04 crc kubenswrapper[4962]: W0220 10:15:04.357547 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c1a487_1a74_4994_9b39_f05cbe0fa5c7.slice/crio-3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13 WatchSource:0}: Error finding container 3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13: Status 404 returned error can't find the container with id 
3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13 Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.627540 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:04 crc kubenswrapper[4962]: W0220 10:15:04.676943 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89dbdc4c_bf31_402e_b5bf_e8bbb8c16172.slice/crio-6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d WatchSource:0}: Error finding container 6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d: Status 404 returned error can't find the container with id 6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.705700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerStarted","Data":"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.705754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerStarted","Data":"3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.713379 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.713410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e"} Feb 20 10:15:04 
crc kubenswrapper[4962]: I0220 10:15:04.718917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerDied","Data":"2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.718967 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.719029 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.161275 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" path="/var/lib/kubelet/pods/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a/volumes" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.275119 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.719228 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.720075 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" containerID="cri-o://ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441" gracePeriod=30 Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.721152 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" 
containerID="cri-o://bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f" gracePeriod=30 Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.729064 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": EOF" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.744683 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:15:05 crc kubenswrapper[4962]: E0220 10:15:05.745253 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerName="collect-profiles" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.745275 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerName="collect-profiles" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.745471 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerName="collect-profiles" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.746815 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.752697 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.787690 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.801013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerStarted","Data":"aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.801068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerStarted","Data":"6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.815142 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerStarted","Data":"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.815624 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.815710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.855460 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84464996cb-fhnvz" 
podStartSLOduration=2.855440544 podStartE2EDuration="2.855440544s" podCreationTimestamp="2026-02-20 10:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:05.844361419 +0000 UTC m=+1197.426833265" watchObservedRunningTime="2026-02-20 10:15:05.855440544 +0000 UTC m=+1197.437912390" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920362 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920638 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 
10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023054 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 
10:15:06.023175 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023239 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023303 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.029484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.030785 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.031088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.031276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.034155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.049471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod 
\"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.056178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.146007 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.827206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be"} Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.831394 4962 generic.go:334] "Generic (PLEG): container finished" podID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerID="bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f" exitCode=0 Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.831489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerDied","Data":"bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f"} Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.850484 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerStarted","Data":"7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48"} Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.897683 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=3.897661221 podStartE2EDuration="3.897661221s" podCreationTimestamp="2026-02-20 10:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:06.872638813 +0000 UTC m=+1198.455110669" watchObservedRunningTime="2026-02-20 10:15:06.897661221 +0000 UTC m=+1198.480133067" Feb 20 10:15:06 crc kubenswrapper[4962]: W0220 10:15:06.983556 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ce3f9c_b8d2_4c53_a494_3aa01ec4f9b3.slice/crio-4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411 WatchSource:0}: Error finding container 4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411: Status 404 returned error can't find the container with id 4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411 Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.986448 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.022633 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.287631 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.729936 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.800703 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.801023 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" 
podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" containerID="cri-o://04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" gracePeriod=10 Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878103 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerStarted","Data":"731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f"} Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerStarted","Data":"45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3"} Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878289 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerStarted","Data":"4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411"} Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878424 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878903 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.934031 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dfd6b5f7f-dkfsl" podStartSLOduration=2.934014337 podStartE2EDuration="2.934014337s" podCreationTimestamp="2026-02-20 10:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:07.928420883 +0000 UTC m=+1199.510892729" watchObservedRunningTime="2026-02-20 10:15:07.934014337 +0000 UTC 
m=+1199.516486183" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.935760 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.421971 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589256 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589609 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589725 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.602160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r" (OuterVolumeSpecName: "kube-api-access-w864r") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "kube-api-access-w864r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.649484 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config" (OuterVolumeSpecName: "config") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.653398 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.659743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.666373 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.667527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695316 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695361 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695372 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695384 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695398 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695412 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.735962 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": dial tcp 10.217.0.152:9696: connect: 
connection refused" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.890042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606"} Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.890279 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892259 4962 generic.go:334] "Generic (PLEG): container finished" podID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" exitCode=0 Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892321 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerDied","Data":"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563"} Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerDied","Data":"48cb63ddc063927b98c0acd0bd8342b9608beaa369b2e5045eda23591cd6bfd4"} Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892490 4962 scope.go:117] "RemoveContainer" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.893906 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" 
containerID="cri-o://f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f" gracePeriod=30 Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.893971 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" containerID="cri-o://565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501" gracePeriod=30 Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.921802 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.191846969 podStartE2EDuration="6.92177431s" podCreationTimestamp="2026-02-20 10:15:02 +0000 UTC" firstStartedPulling="2026-02-20 10:15:03.966272201 +0000 UTC m=+1195.548744047" lastFinishedPulling="2026-02-20 10:15:07.696199542 +0000 UTC m=+1199.278671388" observedRunningTime="2026-02-20 10:15:08.919037235 +0000 UTC m=+1200.501509091" watchObservedRunningTime="2026-02-20 10:15:08.92177431 +0000 UTC m=+1200.504246156" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.929657 4962 scope.go:117] "RemoveContainer" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.959635 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.970746 4962 scope.go:117] "RemoveContainer" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" Feb 20 10:15:08 crc kubenswrapper[4962]: E0220 10:15:08.971497 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563\": container with ID starting with 04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563 not found: ID does not exist" 
containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.971560 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563"} err="failed to get container status \"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563\": rpc error: code = NotFound desc = could not find container \"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563\": container with ID starting with 04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563 not found: ID does not exist" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.971652 4962 scope.go:117] "RemoveContainer" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" Feb 20 10:15:08 crc kubenswrapper[4962]: E0220 10:15:08.972206 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd\": container with ID starting with ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd not found: ID does not exist" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.972230 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd"} err="failed to get container status \"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd\": rpc error: code = NotFound desc = could not find container \"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd\": container with ID starting with ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd not found: ID does not exist" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.975079 4962 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:15:09 crc kubenswrapper[4962]: I0220 10:15:09.179890 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" path="/var/lib/kubelet/pods/54637a9a-7f3e-439e-adf0-ba5b33a539d3/volumes" Feb 20 10:15:09 crc kubenswrapper[4962]: I0220 10:15:09.582036 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:09 crc kubenswrapper[4962]: I0220 10:15:09.628190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.973435 4962 generic.go:334] "Generic (PLEG): container finished" podID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerID="ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441" exitCode=0 Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.973628 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerDied","Data":"ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441"} Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.981823 4962 generic.go:334] "Generic (PLEG): container finished" podID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerID="565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501" exitCode=0 Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.981874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerDied","Data":"565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501"} Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.538033 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575509 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575722 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575786 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.585204 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5" (OuterVolumeSpecName: "kube-api-access-62jg5") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "kube-api-access-62jg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.586675 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.633928 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config" (OuterVolumeSpecName: "config") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.637155 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.646219 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.663768 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677285 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677345 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677356 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677368 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677377 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677391 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.679563 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: 
"4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.779023 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.043231 4962 generic.go:334] "Generic (PLEG): container finished" podID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerID="f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f" exitCode=0 Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.043384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerDied","Data":"f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f"} Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.049181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerDied","Data":"9861804d9a2a53f03b608fd61261417b654233e3abb2bbc1678d9e37df3e329e"} Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.049314 4962 scope.go:117] "RemoveContainer" containerID="bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.049559 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.107482 4962 scope.go:117] "RemoveContainer" containerID="ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.120160 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.127352 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.339182 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.496146 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.496737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497772 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.498975 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.504757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts" (OuterVolumeSpecName: "scripts") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.504920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl" (OuterVolumeSpecName: "kube-api-access-mrsbl") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "kube-api-access-mrsbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.505837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.575430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601426 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601484 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601496 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601510 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.643848 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data" (OuterVolumeSpecName: "config-data") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.703744 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.877110 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.881014 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.063445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerDied","Data":"d28e07e3df71ca3ef7da3f055fa5cde6ea2cae0c5e5a865d321a0f8d0fb07b31"} Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.063516 4962 scope.go:117] "RemoveContainer" containerID="565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.063658 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.102609 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.110133 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.123808 4962 scope.go:117] "RemoveContainer" containerID="f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.135144 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136364 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136446 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136511 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136564 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136657 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="init" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="init" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136774 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136827 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.138140 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138251 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.138341 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138400 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138742 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138827 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138889 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138945 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.139009 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" 
containerName="cinder-scheduler" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.140325 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.153511 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.158493 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" path="/var/lib/kubelet/pods/4839dc9e-3bbd-48e3-b839-40929e67ce7a/volumes" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.159228 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" path="/var/lib/kubelet/pods/51d56dfc-4e59-4c3d-b26d-a06301f274c8/volumes" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.160000 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.284859 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.287503 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.297889 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324186 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324254 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324333 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324352 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.426322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.426400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427663 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " 
pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427988 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " 
pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.435973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.436075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.438130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.439165 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.443884 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.474971 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.530798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.530932 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.530972 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531078 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc 
kubenswrapper[4962]: I0220 10:15:13.531156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.533614 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.538162 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.539075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.542940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.550389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.554752 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.556554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.646311 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:14 crc kubenswrapper[4962]: I0220 10:15:14.056111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:15:14 crc kubenswrapper[4962]: I0220 10:15:14.100051 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerStarted","Data":"25d2258a03970a75594e5384f741d4a8aaad9e37d3b0b7c512e80fa795dc3283"} Feb 20 10:15:14 crc kubenswrapper[4962]: I0220 10:15:14.138615 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerStarted","Data":"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerStarted","Data":"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127934 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127976 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.133984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerStarted","Data":"f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1"} Feb 20 10:15:15 crc 
kubenswrapper[4962]: I0220 10:15:15.134046 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerStarted","Data":"b094901d04b6844ae7ff61500f6dbd375cab8bf6c8a00346003f62e1a980cada"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.156727 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-687f4cff74-gmh4w" podStartSLOduration=2.156705944 podStartE2EDuration="2.156705944s" podCreationTimestamp="2026-02-20 10:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:15.152146762 +0000 UTC m=+1206.734618608" watchObservedRunningTime="2026-02-20 10:15:15.156705944 +0000 UTC m=+1206.739177790" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.453304 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.481079 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.588929 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.589606 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" containerID="cri-o://55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a" gracePeriod=30 Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.589829 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" 
containerName="barbican-api" containerID="cri-o://d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec" gracePeriod=30 Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.159260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerStarted","Data":"c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def"} Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.186649 4962 generic.go:334] "Generic (PLEG): container finished" podID="2320213f-c3b3-4074-95f9-ad86446193b3" containerID="55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a" exitCode=143 Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.187945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerDied","Data":"55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a"} Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.191515 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.1914975 podStartE2EDuration="3.1914975s" podCreationTimestamp="2026-02-20 10:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:16.186414212 +0000 UTC m=+1207.768886058" watchObservedRunningTime="2026-02-20 10:15:16.1914975 +0000 UTC m=+1207.773969346" Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.308974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.452954 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.475327 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.805233 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44316->10.217.0.161:9311: read: connection reset by peer" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.805216 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44312->10.217.0.161:9311: read: connection reset by peer" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.224929 4962 generic.go:334] "Generic (PLEG): container finished" podID="2320213f-c3b3-4074-95f9-ad86446193b3" containerID="d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec" exitCode=0 Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.224997 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerDied","Data":"d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec"} Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.321524 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.515757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.515980 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516045 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516162 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516924 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs" (OuterVolumeSpecName: "logs") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.537103 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd" (OuterVolumeSpecName: "kube-api-access-pvtmd") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "kube-api-access-pvtmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.537284 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.561714 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.576032 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data" (OuterVolumeSpecName: "config-data") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618709 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618747 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618762 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618778 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618793 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.995754 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 10:15:19 crc kubenswrapper[4962]: E0220 10:15:19.996910 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.997105 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" Feb 20 10:15:19 crc 
kubenswrapper[4962]: E0220 10:15:19.997334 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.997518 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.998068 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.998264 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.000044 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.005067 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.005085 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s8k4c" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.005491 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.037268 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.128006 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " 
pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.128414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.128576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.130336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.234741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.235379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.236697 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.236927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.239642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.246912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.249698 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerDied","Data":"ce6724e81ded72c0c4f8c258e7503371ea527420f9524e9628d9189a535cf4b5"} Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.249753 4962 scope.go:117] "RemoveContainer" containerID="d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.249962 4962 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.257704 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.271324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.329780 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.468246 4962 scope.go:117] "RemoveContainer" containerID="55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.475039 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.501411 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.884128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 10:15:20 crc kubenswrapper[4962]: W0220 10:15:20.894231 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod755ca463_8c62_402c_8a88_a066fb38b521.slice/crio-f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb 
WatchSource:0}: Error finding container f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb: Status 404 returned error can't find the container with id f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb Feb 20 10:15:21 crc kubenswrapper[4962]: I0220 10:15:21.169476 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" path="/var/lib/kubelet/pods/2320213f-c3b3-4074-95f9-ad86446193b3/volumes" Feb 20 10:15:21 crc kubenswrapper[4962]: I0220 10:15:21.261988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"755ca463-8c62-402c-8a88-a066fb38b521","Type":"ContainerStarted","Data":"f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb"} Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.203813 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.205838 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.208855 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.208864 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.211384 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.222071 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.230495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.230974 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc 
kubenswrapper[4962]: I0220 10:15:23.231314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc 
kubenswrapper[4962]: I0220 10:15:23.334158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.334271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335462 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335519 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335546 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335588 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.336559 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.336936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.341423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.341748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.341952 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.343118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.347289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.352563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod 
\"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.550367 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.768898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.004473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:15:24 crc kubenswrapper[4962]: W0220 10:15:24.010579 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod559addbd_1bc6_4146_9a27_ce3e1d3d08fd.slice/crio-cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73 WatchSource:0}: Error finding container cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73: Status 404 returned error can't find the container with id cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.095886 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.096655 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" containerID="cri-o://eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.096794 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" 
containerID="cri-o://3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.096856 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" containerID="cri-o://884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.097327 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" containerID="cri-o://eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.109583 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": EOF" Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.310104 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerStarted","Data":"68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9"} Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.310164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerStarted","Data":"cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73"} Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.318319 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be" exitCode=2 Feb 20 10:15:24 
crc kubenswrapper[4962]: I0220 10:15:24.318393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.334666 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606" exitCode=0 Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.335045 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84" exitCode=0 Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.334728 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.335108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.338222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerStarted","Data":"42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.338493 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.338512 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.367681 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b685f5b9-4db6w" podStartSLOduration=2.367445644 podStartE2EDuration="2.367445644s" podCreationTimestamp="2026-02-20 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:25.363074467 +0000 UTC m=+1216.945546323" watchObservedRunningTime="2026-02-20 10:15:25.367445644 +0000 UTC m=+1216.949917480" Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.402463 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.404650 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" containerID="cri-o://eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9" gracePeriod=30 Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.404684 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" containerID="cri-o://d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f" gracePeriod=30 Feb 20 10:15:26 crc kubenswrapper[4962]: I0220 10:15:26.366405 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerID="eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9" exitCode=143 Feb 20 10:15:26 crc kubenswrapper[4962]: I0220 10:15:26.366691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerDied","Data":"eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9"} Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.395379 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f" exitCode=0 Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.396670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f"} Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.410540 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.410901 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" containerID="cri-o://888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d" gracePeriod=30 Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.411098 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" containerID="cri-o://ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0" gracePeriod=30 Feb 20 10:15:28 crc kubenswrapper[4962]: I0220 10:15:28.409700 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerID="888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d" exitCode=143 Feb 20 10:15:28 crc kubenswrapper[4962]: I0220 10:15:28.410542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerDied","Data":"888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d"} Feb 20 10:15:29 crc kubenswrapper[4962]: I0220 10:15:29.423644 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerID="d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f" exitCode=0 Feb 20 10:15:29 crc kubenswrapper[4962]: I0220 10:15:29.423810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerDied","Data":"d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f"} Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.474552 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerID="ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0" exitCode=0 Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.474881 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerDied","Data":"ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0"} Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.584454 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.744560 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.749829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.749873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.749929 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.750117 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.750201 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.750237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.751016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.751267 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.751312 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.753187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s" (OuterVolumeSpecName: "kube-api-access-mg88s") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). 
InnerVolumeSpecName "kube-api-access-mg88s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.755653 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts" (OuterVolumeSpecName: "scripts") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.777650 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.792292 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.850737 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.853750 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.853910 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854680 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854696 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854707 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854718 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.859563 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.860504 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.860748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92" (OuterVolumeSpecName: "kube-api-access-smt92") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "kube-api-access-smt92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.950438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data" (OuterVolumeSpecName: "config-data") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955676 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955835 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955932 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955997 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956113 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956230 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956279 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956330 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956364 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs" (OuterVolumeSpecName: "logs") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956878 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956981 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs" (OuterVolumeSpecName: "logs") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957580 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957615 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957625 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957636 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957649 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957671 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957680 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.963682 4962 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts" (OuterVolumeSpecName: "scripts") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.964434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.972734 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts" (OuterVolumeSpecName: "scripts") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.990136 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.001172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.032579 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.036043 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data" (OuterVolumeSpecName: "config-data") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.036289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.047050 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.050422 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data" (OuterVolumeSpecName: "config-data") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.059578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.059735 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060478 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060506 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060519 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc 
kubenswrapper[4962]: I0220 10:15:32.060532 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060543 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060552 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060566 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060578 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060604 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060615 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.063235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj" (OuterVolumeSpecName: "kube-api-access-xqshj") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "kube-api-access-xqshj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.063537 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.164927 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.164993 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.186530 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.267852 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.487964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"755ca463-8c62-402c-8a88-a066fb38b521","Type":"ContainerStarted","Data":"58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.491304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerDied","Data":"80442f73d0bb0b1518e80cca1be32921f22e35f7b8a9442cfe4b3a67ae521feb"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.491356 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.491383 4962 scope.go:117] "RemoveContainer" containerID="d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.496280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerDied","Data":"9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.496356 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.500674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.500883 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.510904 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.004548925 podStartE2EDuration="13.510880366s" podCreationTimestamp="2026-02-20 10:15:19 +0000 UTC" firstStartedPulling="2026-02-20 10:15:20.898174433 +0000 UTC m=+1212.480646279" lastFinishedPulling="2026-02-20 10:15:31.404505874 +0000 UTC m=+1222.986977720" observedRunningTime="2026-02-20 10:15:32.50811539 +0000 UTC m=+1224.090587236" watchObservedRunningTime="2026-02-20 10:15:32.510880366 +0000 UTC m=+1224.093352232" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.533433 4962 scope.go:117] "RemoveContainer" containerID="eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.566988 4962 scope.go:117] "RemoveContainer" containerID="ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.576070 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.593497 4962 scope.go:117] "RemoveContainer" containerID="888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.595906 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.611353 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.623898 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.631892 4962 scope.go:117] "RemoveContainer" 
containerID="eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.634776 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.655793 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.658952 4962 scope.go:117] "RemoveContainer" containerID="3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666153 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666708 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666725 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666737 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666744 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666756 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666764 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666784 4962 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666790 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666819 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666824 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666831 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666839 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666856 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666861 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666881 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667146 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667167 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667178 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667191 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667200 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667214 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667223 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667230 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.668656 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.669924 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.671126 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.671448 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhn7" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.671766 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.675935 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.683072 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.686178 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.686362 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.691194 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.691716 4962 scope.go:117] "RemoveContainer" containerID="884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.710761 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.731763 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.733537 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.736821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.737058 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.747654 4962 scope.go:117] "RemoveContainer" containerID="eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.756807 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.778919 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.778975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 
10:15:32.779031 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779101 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779147 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779168 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779224 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779275 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.881986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") 
pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882920 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 
10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883101 
4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883175 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882762 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.884349 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.885011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.885259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.889424 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.889798 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc 
kubenswrapper[4962]: I0220 10:15:32.890008 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.890179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.892395 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.899942 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.904588 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.910094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.911338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.914996 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.923583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985735 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") 
" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985791 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 
10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.987568 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.988151 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.990434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.990896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.994276 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.001502 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.001661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.002228 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.009221 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.011249 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.029785 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.056110 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.162123 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" path="/var/lib/kubelet/pods/18d02cdb-5de5-457e-9f17-1cc3ba51ca55/volumes"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.163463 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" path="/var/lib/kubelet/pods/5d49a0b0-18e1-4701-9a94-5ff22700ffdf/volumes"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.165756 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" path="/var/lib/kubelet/pods/eb8bed08-cb47-42cb-a192-2545a14e4c4b/volumes"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.562651 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b685f5b9-4db6w"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.564816 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b685f5b9-4db6w"
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.696112 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.716581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.843771 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.529282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerStarted","Data":"a7c3f6bf061e2f58df1199abfaabc0fa7edc0079e61af3f51614ef7b77cc0b31"}
Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.550606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"}
Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.550674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"56a6ee57a73de30149034b6e88679c3be01baa1f232bfba996e0533713b1689a"}
Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.555576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerStarted","Data":"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b"}
Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.555650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerStarted","Data":"256cfc6edb7fdfbe31dd4d739c6bcf21323de33dda20f71407beaea0eb6fd7bc"}
Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.668302 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.567205 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerStarted","Data":"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f"}
Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.569676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerStarted","Data":"fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed"}
Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.569732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerStarted","Data":"2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12"}
Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.571931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"}
Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.597373 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5973470499999998 podStartE2EDuration="3.59734705s" podCreationTimestamp="2026-02-20 10:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:35.595147221 +0000 UTC m=+1227.177619067" watchObservedRunningTime="2026-02-20 10:15:35.59734705 +0000 UTC m=+1227.179818896"
Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.623276 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.623249755 podStartE2EDuration="3.623249755s" podCreationTimestamp="2026-02-20 10:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:35.620934682 +0000 UTC m=+1227.203406528" watchObservedRunningTime="2026-02-20 10:15:35.623249755 +0000 UTC m=+1227.205721601"
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.161173 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5dfd6b5f7f-dkfsl"
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.281666 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"]
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.281972 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ffdf447d4-qtmvr" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api" containerID="cri-o://859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" gracePeriod=30
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.282065 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ffdf447d4-qtmvr" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd" containerID="cri-o://58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" gracePeriod=30
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.583231 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" exitCode=0
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.583296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerDied","Data":"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"}
Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.585551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"}
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.633511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"}
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634312 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core" containerID="cri-o://13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" gracePeriod=30
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634436 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634117 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent" containerID="cri-o://1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" gracePeriod=30
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634254 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd" containerID="cri-o://fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" gracePeriod=30
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634347 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent" containerID="cri-o://a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" gracePeriod=30
Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.698691 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.94729775 podStartE2EDuration="8.698658828s" podCreationTimestamp="2026-02-20 10:15:32 +0000 UTC" firstStartedPulling="2026-02-20 10:15:33.724132722 +0000 UTC m=+1225.306604568" lastFinishedPulling="2026-02-20 10:15:39.4754938 +0000 UTC m=+1231.057965646" observedRunningTime="2026-02-20 10:15:40.682701716 +0000 UTC m=+1232.265173592" watchObservedRunningTime="2026-02-20 10:15:40.698658828 +0000 UTC m=+1232.281130684"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.494981 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.627715 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") "
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.627935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") "
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.628130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") "
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.628254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") "
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.628296 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") "
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.638731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.641776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm" (OuterVolumeSpecName: "kube-api-access-q54gm") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "kube-api-access-q54gm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651229 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" exitCode=0
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651274 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" exitCode=2
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651291 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" exitCode=0
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651317 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"}
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"}
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"}
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654110 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" exitCode=0
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654163 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerDied","Data":"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"}
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerDied","Data":"81c44508e7e551a0ec9263f4a7d0314158cbc47cdfa61ceff1466d2aef98334e"}
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654165 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654268 4962 scope.go:117] "RemoveContainer" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.689782 4962 scope.go:117] "RemoveContainer" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.701011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config" (OuterVolumeSpecName: "config") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.702583 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.727682 4962 scope.go:117] "RemoveContainer" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"
Feb 20 10:15:41 crc kubenswrapper[4962]: E0220 10:15:41.728584 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574\": container with ID starting with 58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574 not found: ID does not exist" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.728802 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"} err="failed to get container status \"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574\": rpc error: code = NotFound desc = could not find container \"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574\": container with ID starting with 58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574 not found: ID does not exist"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.728926 4962 scope.go:117] "RemoveContainer" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"
Feb 20 10:15:41 crc kubenswrapper[4962]: E0220 10:15:41.731082 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac\": container with ID starting with 859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac not found: ID does not exist" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731147 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"} err="failed to get container status \"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac\": rpc error: code = NotFound desc = could not find container \"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac\": container with ID starting with 859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac not found: ID does not exist"
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731718 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731761 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731776 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731791 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.756658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.833486 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.013841 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"]
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.027164 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"]
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.446410 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.451720 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.451902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.451984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452025 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452082 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452124 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452212 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") "
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452775 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452948 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.458747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts" (OuterVolumeSpecName: "scripts") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.461339 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc" (OuterVolumeSpecName: "kube-api-access-xtlzc") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "kube-api-access-xtlzc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.530874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554699 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554733 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554746 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554757 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554767 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.584530 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data" (OuterVolumeSpecName: "config-data") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.590536 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.656074 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.656114 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673204 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" exitCode=0
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"}
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673334 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"56a6ee57a73de30149034b6e88679c3be01baa1f232bfba996e0533713b1689a"}
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673358 4962 scope.go:117] "RemoveContainer" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673423 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.693426 4962 scope.go:117] "RemoveContainer" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.720259 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.727050 4962 scope.go:117] "RemoveContainer" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.734660 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.753018 4962 scope.go:117] "RemoveContainer" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.763761 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764413 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764443 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764474 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764504 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764515 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764526 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764538 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764564 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764582 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764607 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765912 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765966 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765978 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765993 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.766009 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.766025 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.768459 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.771384 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.771473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.778916 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.830467 4962 scope.go:117] "RemoveContainer" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.831230 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70\": container with ID starting with fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70 not found: ID does not exist" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.831283 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"} err="failed to get container status \"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70\": rpc error: code = NotFound desc = could not find container \"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70\": container with ID starting with fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70 not found: ID does not exist"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.831321 4962 scope.go:117] "RemoveContainer" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"
Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.832200 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57\": container with ID starting with 13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57 not found: ID does not exist" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"
Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.832224 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"} err="failed to get container status \"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57\": rpc error: code = NotFound desc = could not find container \"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57\": container with ID starting with 13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57 not found: ID does
not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.832240 4962 scope.go:117] "RemoveContainer" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.832967 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9\": container with ID starting with a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9 not found: ID does not exist" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.833007 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"} err="failed to get container status \"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9\": rpc error: code = NotFound desc = could not find container \"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9\": container with ID starting with a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9 not found: ID does not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.833037 4962 scope.go:117] "RemoveContainer" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.833366 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75\": container with ID starting with 1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75 not found: ID does not exist" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.833417 4962 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"} err="failed to get container status \"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75\": rpc error: code = NotFound desc = could not find container \"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75\": container with ID starting with 1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75 not found: ID does not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861174 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 
10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861480 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861606 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.963529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.964001 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.964528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.964635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965316 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965350 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: 
I0220 10:15:42.965680 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.969111 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.970070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.970974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.983778 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.988867 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"ceilometer-0\" (UID: 
\"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.002615 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.002702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.056952 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.057419 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.057476 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.068881 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.103783 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.119332 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.124100 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.183629 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" path="/var/lib/kubelet/pods/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58/volumes" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.184286 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" path="/var/lib/kubelet/pods/ff5064c2-d8a3-41f3-8d14-8794be8126e1/volumes" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.623057 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688288 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"a3529eb7b3b629ef3c3b91bbe1d433d262412a55d8bda6f23a58ec282b63369e"} Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688829 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688873 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688884 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688894 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:15:44 crc kubenswrapper[4962]: I0220 10:15:44.698496 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7"} Feb 20 10:15:44 crc kubenswrapper[4962]: I0220 10:15:44.951087 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.000975 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.077938 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.078582 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-755cb8b5f4-zlzbb" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" containerID="cri-o://c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f" gracePeriod=30 Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.078787 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-755cb8b5f4-zlzbb" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" containerID="cri-o://bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce" gracePeriod=30 Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.293721 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.321146 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.379777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.383018 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.384178 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.389078 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.434260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.434353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.442663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.470857 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.472833 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.514875 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538210 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538354 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.542135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.564181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.589855 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.591578 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.615534 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.619675 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.629073 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.632886 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645513 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " 
pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.646475 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.646745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.656657 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.667223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.668415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjqp\" (UniqueName: 
\"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.717263 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.738024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747374 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747474 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747525 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.751307 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.752717 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.763624 4962 generic.go:334] "Generic (PLEG): container finished" podID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerID="c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f" exitCode=143 Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.763747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerDied","Data":"c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f"} Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.765460 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.774221 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.782207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470"} Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 
10:15:45.849860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.849947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850214 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: 
\"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850248 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.851581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.853361 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.856409 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.871168 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.878107 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.953075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.953172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.957417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.976264 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.990483 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.997726 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.285056 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.350223 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.535955 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.605522 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:15:46 crc kubenswrapper[4962]: W0220 10:15:46.615925 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f853840_0af1_40ee_b11b_a0a62f9f4ebf.slice/crio-a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f WatchSource:0}: Error finding container a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f: Status 404 returned error can't find the container with id a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.781694 4962 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.798751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e96c-account-create-update-zd8bf" event={"ID":"84f50d98-6178-44d4-8ac4-43a8df4e3339","Type":"ContainerStarted","Data":"cca63b6df77c4f05f324336e287d723bbb4b8475a0e294b3803987c9653e4132"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.801046 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerStarted","Data":"a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.802925 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerStarted","Data":"d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.802953 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerStarted","Data":"bc0814885963d26026e551a006a093acd6a52246d2b26709ab56c1be71ccdd20"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.809620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.879268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9xxwl" podStartSLOduration=1.879237405 podStartE2EDuration="1.879237405s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:46.832222558 +0000 UTC m=+1238.414694424" watchObservedRunningTime="2026-02-20 10:15:46.879237405 +0000 UTC m=+1238.461709251" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.889154 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.912020 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.912176 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.070639 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.117381 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.117534 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.129281 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.139464 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.822213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerStarted","Data":"7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09"} Feb 20 10:15:47 crc kubenswrapper[4962]: 
I0220 10:15:47.822483 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerStarted","Data":"e4727e33fbaf4f2fd6f0e77e50474250b81bce0793abe10591d440e3d3f794f9"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.824834 4962 generic.go:334] "Generic (PLEG): container finished" podID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerID="e1788ed30c723d96dcb6e0f9484b28a97145a65cc9e3bff73edd5bbbf2ff0b13" exitCode=0 Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.824884 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e96c-account-create-update-zd8bf" event={"ID":"84f50d98-6178-44d4-8ac4-43a8df4e3339","Type":"ContainerDied","Data":"e1788ed30c723d96dcb6e0f9484b28a97145a65cc9e3bff73edd5bbbf2ff0b13"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.826400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerStarted","Data":"793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.826427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerStarted","Data":"dd7896a5012e73b8ddeadf1951baeb976b785e72dfec1740274bc3a01d4e93d0"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.828008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerStarted","Data":"e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.832402 4962 generic.go:334] "Generic (PLEG): container finished" podID="7da93993-8b14-45f6-8d0b-8366becc762e" 
containerID="d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0" exitCode=0 Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.832455 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerDied","Data":"d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.836031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.836888 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.842582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerStarted","Data":"e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.842638 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerStarted","Data":"d4010cdeffdef09dabf121775c79c0d5d454ac777c49cc0bb7a0999ba9e9cc4c"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.846370 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" podStartSLOduration=2.846355096 podStartE2EDuration="2.846355096s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.843566681 +0000 UTC 
m=+1239.426038527" watchObservedRunningTime="2026-02-20 10:15:47.846355096 +0000 UTC m=+1239.428826932" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.882730 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-7729-account-create-update-dttxs" podStartSLOduration=2.882709116 podStartE2EDuration="2.882709116s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.859552454 +0000 UTC m=+1239.442024300" watchObservedRunningTime="2026-02-20 10:15:47.882709116 +0000 UTC m=+1239.465180962" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.926574 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-tbn8g" podStartSLOduration=2.926548517 podStartE2EDuration="2.926548517s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.915990952 +0000 UTC m=+1239.498462798" watchObservedRunningTime="2026-02-20 10:15:47.926548517 +0000 UTC m=+1239.509020363" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.941480 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.533466568 podStartE2EDuration="5.941451435s" podCreationTimestamp="2026-02-20 10:15:42 +0000 UTC" firstStartedPulling="2026-02-20 10:15:43.633561896 +0000 UTC m=+1235.216033742" lastFinishedPulling="2026-02-20 10:15:47.041546773 +0000 UTC m=+1238.624018609" observedRunningTime="2026-02-20 10:15:47.936892605 +0000 UTC m=+1239.519364461" watchObservedRunningTime="2026-02-20 10:15:47.941451435 +0000 UTC m=+1239.523923301" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.960285 4962 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-db-create-xnwmz" podStartSLOduration=2.960260815 podStartE2EDuration="2.960260815s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.955064295 +0000 UTC m=+1239.537536141" watchObservedRunningTime="2026-02-20 10:15:47.960260815 +0000 UTC m=+1239.542732661" Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.091933 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.850585 4962 generic.go:334] "Generic (PLEG): container finished" podID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerID="e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.850790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerDied","Data":"e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857237 4962 generic.go:334] "Generic (PLEG): container finished" podID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerID="bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerDied","Data":"bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857367 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" 
event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerDied","Data":"278c9072e567ac676f1ff447db5bfcb24f5eba477a61436baf57c6f5bf95aba9"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857382 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278c9072e567ac676f1ff447db5bfcb24f5eba477a61436baf57c6f5bf95aba9" Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.858572 4962 generic.go:334] "Generic (PLEG): container finished" podID="032f830f-9636-4783-a048-00f9b7b22a3a" containerID="7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.858636 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerDied","Data":"7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.859734 4962 generic.go:334] "Generic (PLEG): container finished" podID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerID="793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.859784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerDied","Data":"793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.860837 4962 generic.go:334] "Generic (PLEG): container finished" podID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerID="e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.861033 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" 
event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerDied","Data":"e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.904213 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051380 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051655 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051686 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051730 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod 
\"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051860 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.058061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs" (OuterVolumeSpecName: "logs") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.061285 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq" (OuterVolumeSpecName: "kube-api-access-ch9vq") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "kube-api-access-ch9vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.069737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts" (OuterVolumeSpecName: "scripts") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.224890 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.243711 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.244527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data" (OuterVolumeSpecName: "config-data") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.245371 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.245631 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.245648 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.348068 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.355714 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.355843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.386888 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.413031 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.453522 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.453551 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.554419 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"7da93993-8b14-45f6-8d0b-8366becc762e\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.554539 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"84f50d98-6178-44d4-8ac4-43a8df4e3339\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.554698 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod 
\"84f50d98-6178-44d4-8ac4-43a8df4e3339\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.555023 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"7da93993-8b14-45f6-8d0b-8366becc762e\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.555764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84f50d98-6178-44d4-8ac4-43a8df4e3339" (UID: "84f50d98-6178-44d4-8ac4-43a8df4e3339"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.557563 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.558042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7da93993-8b14-45f6-8d0b-8366becc762e" (UID: "7da93993-8b14-45f6-8d0b-8366becc762e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.558659 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg" (OuterVolumeSpecName: "kube-api-access-vjtsg") pod "7da93993-8b14-45f6-8d0b-8366becc762e" (UID: "7da93993-8b14-45f6-8d0b-8366becc762e"). 
InnerVolumeSpecName "kube-api-access-vjtsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.561945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl" (OuterVolumeSpecName: "kube-api-access-n6njl") pod "84f50d98-6178-44d4-8ac4-43a8df4e3339" (UID: "84f50d98-6178-44d4-8ac4-43a8df4e3339"). InnerVolumeSpecName "kube-api-access-n6njl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.660438 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.660492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.660505 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.885220 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e96c-account-create-update-zd8bf" event={"ID":"84f50d98-6178-44d4-8ac4-43a8df4e3339","Type":"ContainerDied","Data":"cca63b6df77c4f05f324336e287d723bbb4b8475a0e294b3803987c9653e4132"} Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.885635 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca63b6df77c4f05f324336e287d723bbb4b8475a0e294b3803987c9653e4132" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 
10:15:49.885386 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890635 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerDied","Data":"bc0814885963d26026e551a006a093acd6a52246d2b26709ab56c1be71ccdd20"} Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890722 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0814885963d26026e551a006a093acd6a52246d2b26709ab56c1be71ccdd20" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890638 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892168 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" containerID="cri-o://28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892337 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" containerID="cri-o://bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892387 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" 
containerName="ceilometer-notification-agent" containerID="cri-o://b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892438 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" containerID="cri-o://0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.946832 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.963417 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.513651 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.588617 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.588927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.590409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"79394db3-1fa2-4b8f-927a-1cf8085f1df4" (UID: "79394db3-1fa2-4b8f-927a-1cf8085f1df4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.598803 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5" (OuterVolumeSpecName: "kube-api-access-gs8c5") pod "79394db3-1fa2-4b8f-927a-1cf8085f1df4" (UID: "79394db3-1fa2-4b8f-927a-1cf8085f1df4"). InnerVolumeSpecName "kube-api-access-gs8c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.690954 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.691000 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.708792 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.710879 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.751353 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"032f830f-9636-4783-a048-00f9b7b22a3a\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792534 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792689 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"032f830f-9636-4783-a048-00f9b7b22a3a\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792933 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.793904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f853840-0af1-40ee-b11b-a0a62f9f4ebf" (UID: "1f853840-0af1-40ee-b11b-a0a62f9f4ebf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.795164 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "032f830f-9636-4783-a048-00f9b7b22a3a" (UID: "032f830f-9636-4783-a048-00f9b7b22a3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.799361 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6" (OuterVolumeSpecName: "kube-api-access-qkfp6") pod "032f830f-9636-4783-a048-00f9b7b22a3a" (UID: "032f830f-9636-4783-a048-00f9b7b22a3a"). InnerVolumeSpecName "kube-api-access-qkfp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.810810 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp" (OuterVolumeSpecName: "kube-api-access-tcjqp") pod "1f853840-0af1-40ee-b11b-a0a62f9f4ebf" (UID: "1f853840-0af1-40ee-b11b-a0a62f9f4ebf"). InnerVolumeSpecName "kube-api-access-tcjqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.895906 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"85565888-6622-4dfc-9198-8e9c5b05cc75\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.896435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"85565888-6622-4dfc-9198-8e9c5b05cc75\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897050 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897064 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897078 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897091 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897566 4962 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85565888-6622-4dfc-9198-8e9c5b05cc75" (UID: "85565888-6622-4dfc-9198-8e9c5b05cc75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.903744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt" (OuterVolumeSpecName: "kube-api-access-75wzt") pod "85565888-6622-4dfc-9198-8e9c5b05cc75" (UID: "85565888-6622-4dfc-9198-8e9c5b05cc75"). InnerVolumeSpecName "kube-api-access-75wzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934888 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276" exitCode=0 Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934925 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49" exitCode=2 Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934932 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470" exitCode=0 Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.935013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.935026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.939267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerDied","Data":"d4010cdeffdef09dabf121775c79c0d5d454ac777c49cc0bb7a0999ba9e9cc4c"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.939295 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4010cdeffdef09dabf121775c79c0d5d454ac777c49cc0bb7a0999ba9e9cc4c" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.939352 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.950313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerDied","Data":"e4727e33fbaf4f2fd6f0e77e50474250b81bce0793abe10591d440e3d3f794f9"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.950360 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4727e33fbaf4f2fd6f0e77e50474250b81bce0793abe10591d440e3d3f794f9" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.950448 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.956850 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.956780 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerDied","Data":"dd7896a5012e73b8ddeadf1951baeb976b785e72dfec1740274bc3a01d4e93d0"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.957552 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7896a5012e73b8ddeadf1951baeb976b785e72dfec1740274bc3a01d4e93d0" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.962473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerDied","Data":"a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.962520 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.962623 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:51 crc kubenswrapper[4962]: I0220 10:15:51.009089 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:51 crc kubenswrapper[4962]: I0220 10:15:51.009145 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:51 crc kubenswrapper[4962]: I0220 10:15:51.152542 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" path="/var/lib/kubelet/pods/b1b02597-c246-43dc-bd85-bebc40c70abf/volumes" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.974425 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.975912 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.975934 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.975951 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.975958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.975988 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.975997 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976009 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976020 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976035 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976045 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976057 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976065 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976084 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976093 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 
10:15:55.976108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976116 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976356 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976374 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976389 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976399 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976412 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976431 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976441 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976453 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" 
containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.977380 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.980571 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.980822 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mb5nf" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.986033 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.996282 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048700 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149558 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.159196 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.159376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.159936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.171583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.296754 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: W0220 10:15:56.834039 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20663c25_09a7_4a31_9994_450f507d4ff1.slice/crio-2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8 WatchSource:0}: Error finding container 2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8: Status 404 returned error can't find the container with id 2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8 Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.834306 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:15:57 crc kubenswrapper[4962]: I0220 10:15:57.043100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerStarted","Data":"2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8"} Feb 20 10:16:02 crc kubenswrapper[4962]: I0220 10:16:02.188947 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7" exitCode=0 Feb 20 10:16:02 crc kubenswrapper[4962]: I0220 10:16:02.189732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7"} Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.235912 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"a3529eb7b3b629ef3c3b91bbe1d433d262412a55d8bda6f23a58ec282b63369e"} Feb 20 10:16:06 crc kubenswrapper[4962]: 
I0220 10:16:06.237134 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3529eb7b3b629ef3c3b91bbe1d433d262412a55d8bda6f23a58ec282b63369e" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.389491 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503351 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503738 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503900 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod 
\"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503980 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.505329 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.505634 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.513357 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts" (OuterVolumeSpecName: "scripts") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.515795 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x" (OuterVolumeSpecName: "kube-api-access-7px4x") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "kube-api-access-7px4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.534131 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.588618 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606277 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606315 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606329 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606340 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606355 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606363 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.628005 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data" (OuterVolumeSpecName: "config-data") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.708215 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.258227 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerStarted","Data":"3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab"} Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.258313 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.290045 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wbq67" podStartSLOduration=2.875283514 podStartE2EDuration="12.290023844s" podCreationTimestamp="2026-02-20 10:15:55 +0000 UTC" firstStartedPulling="2026-02-20 10:15:56.840373314 +0000 UTC m=+1248.422845160" lastFinishedPulling="2026-02-20 10:16:06.255113604 +0000 UTC m=+1257.837585490" observedRunningTime="2026-02-20 10:16:07.28046589 +0000 UTC m=+1258.862937746" watchObservedRunningTime="2026-02-20 10:16:07.290023844 +0000 UTC m=+1258.872495700" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.310670 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.317549 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351268 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351815 4962 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351835 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351858 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351867 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351891 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351897 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351922 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351928 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352147 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352169 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 
10:16:07.352186 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352198 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.355767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.360712 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.360776 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.382801 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532542 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " 
pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.635206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.635269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.635314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.636332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.636481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.645621 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.646263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.657401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.659775 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.662759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.675645 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:08 crc kubenswrapper[4962]: W0220 10:16:08.263900 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457c772c_a7b8_40ea_8573_c483915687be.slice/crio-beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981 WatchSource:0}: Error finding container beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981: Status 404 returned error can't find the container with id beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981 Feb 20 10:16:08 crc kubenswrapper[4962]: I0220 10:16:08.270006 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:09 crc kubenswrapper[4962]: I0220 10:16:09.155288 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" path="/var/lib/kubelet/pods/2263355d-2fa1-4b5a-bfc2-9f362df5739d/volumes" Feb 20 10:16:09 crc kubenswrapper[4962]: I0220 10:16:09.347337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6"} Feb 20 10:16:09 crc kubenswrapper[4962]: I0220 10:16:09.349114 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981"} Feb 20 10:16:10 crc kubenswrapper[4962]: I0220 10:16:10.365082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250"} Feb 20 10:16:11 crc kubenswrapper[4962]: I0220 10:16:11.381497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5"} Feb 20 10:16:11 crc kubenswrapper[4962]: I0220 10:16:11.508195 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:16:11 crc kubenswrapper[4962]: I0220 10:16:11.508642 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:16:12 crc kubenswrapper[4962]: I0220 10:16:12.396179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa"} Feb 20 10:16:12 crc kubenswrapper[4962]: I0220 10:16:12.396838 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:16:12 crc kubenswrapper[4962]: I0220 10:16:12.439420 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.135795244 podStartE2EDuration="5.439395976s" podCreationTimestamp="2026-02-20 10:16:07 +0000 UTC" firstStartedPulling="2026-02-20 10:16:08.267859256 +0000 UTC m=+1259.850331142" lastFinishedPulling="2026-02-20 10:16:11.571460018 +0000 UTC m=+1263.153931874" observedRunningTime="2026-02-20 10:16:12.434568967 +0000 UTC m=+1264.017040823" watchObservedRunningTime="2026-02-20 10:16:12.439395976 +0000 UTC m=+1264.021867832" Feb 20 10:16:18 crc kubenswrapper[4962]: I0220 10:16:18.468326 4962 generic.go:334] "Generic (PLEG): container finished" podID="20663c25-09a7-4a31-9994-450f507d4ff1" containerID="3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab" exitCode=0 Feb 20 10:16:18 crc kubenswrapper[4962]: I0220 10:16:18.468411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerDied","Data":"3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab"} Feb 20 10:16:19 crc kubenswrapper[4962]: I0220 10:16:19.931343 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.081864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.081969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.082227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.082302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.095202 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts" (OuterVolumeSpecName: "scripts") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.095408 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs" (OuterVolumeSpecName: "kube-api-access-r5kqs") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "kube-api-access-r5kqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.119308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data" (OuterVolumeSpecName: "config-data") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.139776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185797 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185854 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185879 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185901 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.496129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerDied","Data":"2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8"} Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.496184 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.496261 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.716988 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:16:20 crc kubenswrapper[4962]: E0220 10:16:20.717732 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" containerName="nova-cell0-conductor-db-sync" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.717752 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" containerName="nova-cell0-conductor-db-sync" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.717987 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" containerName="nova-cell0-conductor-db-sync" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.718878 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.727675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.728056 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mb5nf" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.734171 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.901527 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:20 crc kubenswrapper[4962]: 
I0220 10:16:20.901706 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.903158 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.006296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.006947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.007080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.014641 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.025007 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.039061 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.061821 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.608002 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.529022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerStarted","Data":"5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69"} Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.529617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerStarted","Data":"5925bb54309b7a0a7036656c54ac3f8deef63680ce4f7825beb5965502489453"} Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.530141 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.560211 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5601851289999997 podStartE2EDuration="2.560185129s" podCreationTimestamp="2026-02-20 10:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:22.552012868 +0000 UTC m=+1274.134484724" watchObservedRunningTime="2026-02-20 10:16:22.560185129 +0000 UTC m=+1274.142656975" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.098309 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.788684 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.791068 4962 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.796009 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.797935 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.811319 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.889903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.889980 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.890038 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.890091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.988924 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.990347 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992308 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.010096 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.011321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.013557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.015230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.025674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.034625 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: 
\"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.079301 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.080700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.090441 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.100381 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.106876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.107180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.107355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 
10:16:27.123024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.187162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.189142 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.208964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.225420 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244530 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244583 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.252531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.257481 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.259461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.259553 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.274103 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.299161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.342864 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349291 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: 
\"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349355 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.370132 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.378102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc 
kubenswrapper[4962]: I0220 10:16:27.378155 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.378288 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.380043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.384653 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.384837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.395248 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.399355 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.419289 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.423902 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.435738 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472778 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472941 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472992 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473080 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473164 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473222 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576276 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576339 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576390 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576442 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " 
pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.578478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.579038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.579581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.580031 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.587980 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.603250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.605674 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.612296 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.621273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.625478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod 
\"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.670304 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.713373 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.749088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.105770 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.333081 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.334953 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.341371 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.341639 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.396263 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.417102 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: W0220 10:16:28.468171 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d60578e_e3d0_4ae9_8539_9dfd84ebf836.slice/crio-bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504 WatchSource:0}: Error finding container bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504: Status 404 returned error can't find the container with id bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504 Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.469422 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.506013 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531127 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " 
pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531540 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod 
\"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633820 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.648456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.648520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.648844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.653823 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.658862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.689572 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.698246 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.731276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerStarted","Data":"1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.731344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerStarted","Data":"0f0bc1171f843d6af2226a5d8d968c25ad3b7d5cd1cf52d106ae84ff241ffecd"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.732674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerStarted","Data":"9cb17bd0f831295fec6db28bdfcd5a1a3d6be987d43cd0f564f65a201a071dcf"} Feb 20 10:16:28 
crc kubenswrapper[4962]: I0220 10:16:28.733950 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerStarted","Data":"503b6078fa852a67bd1c7bbebbc0925a0a08be36053fb3fee5407b0136117e50"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.735364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerStarted","Data":"bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.736661 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerStarted","Data":"a7e2d55de2a0689981aefd86aae03373f74c3435dacca5e0c076361b42cb5da4"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.738093 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerStarted","Data":"323c0906415aaaf20c526ccc0a5760d7fccfd56336d524a538bc100ce5c3c6b2"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.752635 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kjq4f" podStartSLOduration=2.7526164509999997 podStartE2EDuration="2.752616451s" podCreationTimestamp="2026-02-20 10:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:28.747514104 +0000 UTC m=+1280.329985950" watchObservedRunningTime="2026-02-20 10:16:28.752616451 +0000 UTC m=+1280.335088297" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.981895 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:16:29 crc kubenswrapper[4962]: 
I0220 10:16:29.754258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerStarted","Data":"c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f"} Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.754732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerStarted","Data":"864d77909ed3ce2537874b3a198bcea7df1e3a117b50ccd248c61b191ba8d805"} Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.794499 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" podStartSLOduration=1.7943583410000001 podStartE2EDuration="1.794358341s" podCreationTimestamp="2026-02-20 10:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:29.784346642 +0000 UTC m=+1281.366818488" watchObservedRunningTime="2026-02-20 10:16:29.794358341 +0000 UTC m=+1281.376830187" Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.797225 4962 generic.go:334] "Generic (PLEG): container finished" podID="619a1578-177c-476f-a471-e39ec43ebf20" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" exitCode=0 Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.798746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerDied","Data":"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a"} Feb 20 10:16:31 crc kubenswrapper[4962]: I0220 10:16:31.285308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:31 crc kubenswrapper[4962]: I0220 10:16:31.293893 4962 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.840805 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerStarted","Data":"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.841524 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.843152 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerStarted","Data":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.847668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerStarted","Data":"e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.849640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerStarted","Data":"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.849895 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" gracePeriod=30 Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.859204 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerStarted","Data":"7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.894156 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" podStartSLOduration=5.8941290859999995 podStartE2EDuration="5.894129086s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:32.871891541 +0000 UTC m=+1284.454363387" watchObservedRunningTime="2026-02-20 10:16:32.894129086 +0000 UTC m=+1284.476600922" Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.894841 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.133593093 podStartE2EDuration="5.894836118s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.418517423 +0000 UTC m=+1280.000989269" lastFinishedPulling="2026-02-20 10:16:32.179760448 +0000 UTC m=+1283.762232294" observedRunningTime="2026-02-20 10:16:32.887883915 +0000 UTC m=+1284.470355761" watchObservedRunningTime="2026-02-20 10:16:32.894836118 +0000 UTC m=+1284.477307964" Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.918849 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.214600108 podStartE2EDuration="6.918824337s" podCreationTimestamp="2026-02-20 10:16:26 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.474665602 +0000 UTC m=+1280.057137448" lastFinishedPulling="2026-02-20 10:16:32.178889821 +0000 UTC m=+1283.761361677" observedRunningTime="2026-02-20 10:16:32.910565163 +0000 UTC m=+1284.493037009" watchObservedRunningTime="2026-02-20 10:16:32.918824337 +0000 UTC m=+1284.501296183" Feb 20 10:16:33 crc 
kubenswrapper[4962]: I0220 10:16:33.881119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerStarted","Data":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.881230 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" containerID="cri-o://398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" gracePeriod=30 Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.881316 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" containerID="cri-o://82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" gracePeriod=30 Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.894475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerStarted","Data":"9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40"} Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.921193 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.241551942 podStartE2EDuration="6.921164434s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.4992947 +0000 UTC m=+1280.081766546" lastFinishedPulling="2026-02-20 10:16:32.178907192 +0000 UTC m=+1283.761379038" observedRunningTime="2026-02-20 10:16:33.90840371 +0000 UTC m=+1285.490875556" watchObservedRunningTime="2026-02-20 10:16:33.921164434 +0000 UTC m=+1285.503636280" Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.946112 4962 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.475433414 podStartE2EDuration="6.946085741s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.70615241 +0000 UTC m=+1280.288624256" lastFinishedPulling="2026-02-20 10:16:32.176804727 +0000 UTC m=+1283.759276583" observedRunningTime="2026-02-20 10:16:33.941042056 +0000 UTC m=+1285.523513892" watchObservedRunningTime="2026-02-20 10:16:33.946085741 +0000 UTC m=+1285.528557587" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.514341 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710481 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710599 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710726 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.711497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs" (OuterVolumeSpecName: "logs") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.729875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz" (OuterVolumeSpecName: "kube-api-access-jjcbz") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "kube-api-access-jjcbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.765809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data" (OuterVolumeSpecName: "config-data") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.767367 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813444 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813495 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813508 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813522 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905658 4962 generic.go:334] "Generic (PLEG): container finished" podID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" exitCode=0 Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905702 4962 generic.go:334] "Generic (PLEG): container finished" podID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" exitCode=143 Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905736 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905765 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerDied","Data":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerDied","Data":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerDied","Data":"9cb17bd0f831295fec6db28bdfcd5a1a3d6be987d43cd0f564f65a201a071dcf"} Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905876 4962 scope.go:117] "RemoveContainer" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.943733 4962 scope.go:117] "RemoveContainer" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.948659 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.960898 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.986882 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.987388 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" Feb 20 10:16:34 crc 
kubenswrapper[4962]: I0220 10:16:34.987409 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.987436 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987444 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987627 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987658 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.989220 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.996232 4962 scope.go:117] "RemoveContainer" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.997902 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": container with ID starting with 82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66 not found: ID does not exist" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.997963 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} err="failed to get container status \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": rpc error: code = NotFound desc = could not find container \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": container with ID starting with 82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66 not found: ID does not exist" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.998003 4962 scope.go:117] "RemoveContainer" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.998447 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": container with ID starting with 398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13 not found: ID does not exist" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 
10:16:34.998499 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} err="failed to get container status \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": rpc error: code = NotFound desc = could not find container \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": container with ID starting with 398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13 not found: ID does not exist" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.998539 4962 scope.go:117] "RemoveContainer" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.999037 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} err="failed to get container status \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": rpc error: code = NotFound desc = could not find container \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": container with ID starting with 82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66 not found: ID does not exist" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.999058 4962 scope.go:117] "RemoveContainer" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.999284 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} err="failed to get container status \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": rpc error: code = NotFound desc = could not find container \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": container with ID starting with 
398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13 not found: ID does not exist" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.000664 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.000666 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.011794 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120339 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.163879 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" path="/var/lib/kubelet/pods/ab0c66b4-1ce3-4594-8780-2effddad7043/volumes" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.222861 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223022 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223052 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223082 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.225575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.230551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.233666 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.236032 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod 
\"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.245575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.340124 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: W0220 10:16:35.907370 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2aa0dbd_0022_4ee1_8bb9_81a20d6a4abd.slice/crio-69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081 WatchSource:0}: Error finding container 69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081: Status 404 returned error can't find the container with id 69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081 Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.918067 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.942358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerStarted","Data":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.943017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerStarted","Data":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 
10:16:36.943037 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerStarted","Data":"69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.945470 4962 generic.go:334] "Generic (PLEG): container finished" podID="28bfacb3-7247-41ad-bf30-47c81427487b" containerID="1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe" exitCode=0 Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.945533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerDied","Data":"1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.973542 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.973516809 podStartE2EDuration="2.973516809s" podCreationTimestamp="2026-02-20 10:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:36.967959947 +0000 UTC m=+1288.550431803" watchObservedRunningTime="2026-02-20 10:16:36.973516809 +0000 UTC m=+1288.555988665" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.420689 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.424953 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.424981 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.464356 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.691689 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.714730 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.714848 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.751765 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.825904 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.826150 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" containerID="cri-o://8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc" gracePeriod=10 Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.969323 4962 generic.go:334] "Generic (PLEG): container finished" podID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerID="8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc" exitCode=0 Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.969409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerDied","Data":"8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc"} Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.007993 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerID="c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f" exitCode=0 Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.008296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerDied","Data":"c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f"} Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.361481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.599631 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.735817 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.735902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.735962 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.736013 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.743723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts" (OuterVolumeSpecName: "scripts") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.748064 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn" (OuterVolumeSpecName: "kube-api-access-fxmdn") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "kube-api-access-fxmdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.755846 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.769420 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data" (OuterVolumeSpecName: "config-data") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.784999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.804086 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.804152 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838108 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838299 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838777 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839302 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839315 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839324 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839335 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.847584 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5" (OuterVolumeSpecName: "kube-api-access-54bp5") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "kube-api-access-54bp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.900711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config" (OuterVolumeSpecName: "config") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.910907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.912345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.915119 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.926321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.943953 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944003 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944017 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944027 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944037 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944049 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.018224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerDied","Data":"0f0bc1171f843d6af2226a5d8d968c25ad3b7d5cd1cf52d106ae84ff241ffecd"} Feb 20 10:16:39 crc 
kubenswrapper[4962]: I0220 10:16:39.018279 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.018281 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0bc1171f843d6af2226a5d8d968c25ad3b7d5cd1cf52d106ae84ff241ffecd" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.020660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerDied","Data":"a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115"} Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.020739 4962 scope.go:117] "RemoveContainer" containerID="8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.020685 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.072639 4962 scope.go:117] "RemoveContainer" containerID="b058ae21e0210458ea10ea644bf00ea0438ea36899818d84b992ed449c70fc86" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.085381 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.097411 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.237581 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" path="/var/lib/kubelet/pods/b97f91e3-f497-47ad-8d3d-f9945b3bdc34/volumes" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.254692 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:39 crc 
kubenswrapper[4962]: I0220 10:16:39.255106 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" containerID="cri-o://7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.255889 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" containerID="cri-o://9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.269567 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.270025 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" containerID="cri-o://385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.270780 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" containerID="cri-o://f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.569448 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.576426 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671280 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671408 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671629 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671696 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.678009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts" (OuterVolumeSpecName: "scripts") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.682256 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662" (OuterVolumeSpecName: "kube-api-access-d5662") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "kube-api-access-d5662". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.717692 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.719239 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data" (OuterVolumeSpecName: "config-data") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.775952 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.775985 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.775996 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.776007 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.832209 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881527 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881717 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881916 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.882123 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.884379 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs" (OuterVolumeSpecName: "logs") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.888533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l" (OuterVolumeSpecName: "kube-api-access-tld9l") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "kube-api-access-tld9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.922721 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.922771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data" (OuterVolumeSpecName: "config-data") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.960050 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984741 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984803 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984821 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984837 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984850 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031780 4962 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" exitCode=0 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031820 4962 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" exitCode=143 Feb 20 10:16:40 crc kubenswrapper[4962]: 
I0220 10:16:40.031857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerDied","Data":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerDied","Data":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerDied","Data":"69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031917 4962 scope.go:117] "RemoveContainer" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.032119 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.033786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerDied","Data":"864d77909ed3ce2537874b3a198bcea7df1e3a117b50ccd248c61b191ba8d805"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.033815 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.033840 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864d77909ed3ce2537874b3a198bcea7df1e3a117b50ccd248c61b191ba8d805" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.038559 4962 generic.go:334] "Generic (PLEG): container finished" podID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerID="7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff" exitCode=143 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.038640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerDied","Data":"7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.041052 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" containerID="cri-o://e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" gracePeriod=30 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.085634 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.093375 4962 scope.go:117] "RemoveContainer" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.130277 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.138002 4962 scope.go:117] "RemoveContainer" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.145711 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": container with ID starting with f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81 not found: ID does not exist" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.145764 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} err="failed to get container status \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": rpc error: code = NotFound desc = could not find container \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": container with ID starting with f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.145842 4962 scope.go:117] "RemoveContainer" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.146462 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": container with ID starting with 385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078 not found: ID does not exist" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.146487 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} err="failed to get container status \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": rpc error: code = NotFound desc = could not find container 
\"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": container with ID starting with 385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.146505 4962 scope.go:117] "RemoveContainer" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.147024 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} err="failed to get container status \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": rpc error: code = NotFound desc = could not find container \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": container with ID starting with f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.147056 4962 scope.go:117] "RemoveContainer" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.147844 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} err="failed to get container status \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": rpc error: code = NotFound desc = could not find container \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": container with ID starting with 385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166144 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166552 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" containerName="nova-manage" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166574 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" containerName="nova-manage" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166586 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166611 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166635 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166644 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166661 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerName="nova-cell1-conductor-db-sync" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166669 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerName="nova-cell1-conductor-db-sync" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166681 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="init" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166687 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="init" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166702 4962 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166710 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166884 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166894 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166905 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166918 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" containerName="nova-manage" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166931 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerName="nova-cell1-conductor-db-sync" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.167920 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.170562 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.170837 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.199226 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.217760 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.219143 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.221897 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.222742 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.292799 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.292932 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " 
pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.292961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293127 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293337 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " 
pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293410 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.395970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396309 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.397030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.403221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"nova-metadata-0\" (UID: 
\"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.403429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.404964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.408274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.421373 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.423002 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.425519 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.490799 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.539017 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:41 crc kubenswrapper[4962]: W0220 10:16:41.020888 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf680b24_e6dc_40a4_9ee4_521343fd9a28.slice/crio-121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53 WatchSource:0}: Error finding container 121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53: Status 404 returned error can't find the container with id 121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53 Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.031888 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.055081 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerStarted","Data":"121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53"} Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.136981 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.201199 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" 
path="/var/lib/kubelet/pods/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd/volumes" Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.508439 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.509120 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.068117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerStarted","Data":"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.068667 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerStarted","Data":"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.073223 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerStarted","Data":"2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.073286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerStarted","Data":"4fa1fbefe8085f86ec2949fb3171b5df9f6211664e4db89dbc2b776f71f19d88"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.074349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.097816 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.097788778 podStartE2EDuration="2.097788778s" podCreationTimestamp="2026-02-20 10:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:42.090337599 +0000 UTC m=+1293.672809465" watchObservedRunningTime="2026-02-20 10:16:42.097788778 +0000 UTC m=+1293.680260634" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.119288 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.119254518 podStartE2EDuration="2.119254518s" podCreationTimestamp="2026-02-20 10:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:42.117229366 +0000 UTC m=+1293.699701212" watchObservedRunningTime="2026-02-20 10:16:42.119254518 +0000 UTC m=+1293.701726374" Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.427408 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.429901 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.432084 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.432151 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.828546 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.829075 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" containerID="cri-o://fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a" gracePeriod=30 Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.097905 4962 generic.go:334] "Generic (PLEG): container finished" podID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerID="fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a" exitCode=2 Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.098050 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerDied","Data":"fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a"} Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.458352 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.567431 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.576672 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2" (OuterVolumeSpecName: "kube-api-access-kqfr2") pod "b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" (UID: "b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc"). InnerVolumeSpecName "kube-api-access-kqfr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.670493 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.113041 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerDied","Data":"13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d"} Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.113538 4962 scope.go:117] "RemoveContainer" containerID="fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.113054 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.158216 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.181453 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.193440 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: E0220 10:16:44.194348 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.194387 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.194801 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.200429 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.206088 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.213145 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.250828 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.285743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.285874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.285913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 
crc kubenswrapper[4962]: I0220 10:16:44.286016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387507 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc 
kubenswrapper[4962]: I0220 10:16:44.393105 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.394492 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.402960 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.406794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.553877 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.993189 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.993885 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" containerID="cri-o://3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6" gracePeriod=30 Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.994068 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" containerID="cri-o://93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa" gracePeriod=30 Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.994122 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" containerID="cri-o://e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5" gracePeriod=30 Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.994157 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" containerID="cri-o://97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250" gracePeriod=30 Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.069716 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.163040 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" 
path="/var/lib/kubelet/pods/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc/volumes" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.166416 4962 generic.go:334] "Generic (PLEG): container finished" podID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerID="9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40" exitCode=0 Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.166616 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerDied","Data":"9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40"} Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.173446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerStarted","Data":"2851b19111bcc172daacd941571725296e0313b2b3496256066714262e7d3b9a"} Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.174782 4962 generic.go:334] "Generic (PLEG): container finished" podID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" exitCode=0 Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.174814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerDied","Data":"e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3"} Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.274881 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.312999 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314380 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314659 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.315074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs" (OuterVolumeSpecName: "logs") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.315320 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.330189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6" (OuterVolumeSpecName: "kube-api-access-ckgt6") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "kube-api-access-ckgt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.373494 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data" (OuterVolumeSpecName: "config-data") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.420654 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.422399 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.454754 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.491677 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.493885 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.525261 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.525658 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod 
\"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.525753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.526179 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.528650 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw" (OuterVolumeSpecName: "kube-api-access-s26dw") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea"). InnerVolumeSpecName "kube-api-access-s26dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: E0220 10:16:45.553571 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle podName:f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea nodeName:}" failed. No retries permitted until 2026-02-20 10:16:46.053537966 +0000 UTC m=+1297.636009812 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea") : error deleting /var/lib/kubelet/pods/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea/volume-subpaths: remove /var/lib/kubelet/pods/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea/volume-subpaths: no such file or directory Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.556186 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data" (OuterVolumeSpecName: "config-data") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.628725 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.628783 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.143606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.149126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.189625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerDied","Data":"a7e2d55de2a0689981aefd86aae03373f74c3435dacca5e0c076361b42cb5da4"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.189695 4962 scope.go:117] "RemoveContainer" containerID="9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.189821 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198846 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa" exitCode=0 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198882 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5" exitCode=2 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198893 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250" exitCode=0 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198901 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6" exitCode=0 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199002 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199107 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.201864 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.202732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerStarted","Data":"490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.203553 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.206794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerDied","Data":"503b6078fa852a67bd1c7bbebbc0925a0a08be36053fb3fee5407b0136117e50"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.206864 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.222130 4962 scope.go:117] "RemoveContainer" containerID="7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.246710 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.246823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.246911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247076 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247231 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247953 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.249556 4962 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.250694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.261845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts" (OuterVolumeSpecName: "scripts") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.274584 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.287881 4962 scope.go:117] "RemoveContainer" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.287918 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p" (OuterVolumeSpecName: "kube-api-access-pg46p") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "kube-api-access-pg46p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.302370 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.317760 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318416 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318436 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318448 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318457 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318486 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318498 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318513 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318522 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318556 4962 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318564 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318579 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318604 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318623 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318634 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318919 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318939 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318955 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319005 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" Feb 20 
10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319023 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319036 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319047 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.320926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.322831 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.334890 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.337054 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.692357641 podStartE2EDuration="2.337030953s" podCreationTimestamp="2026-02-20 10:16:44 +0000 UTC" firstStartedPulling="2026-02-20 10:16:45.077786785 +0000 UTC m=+1296.660258631" lastFinishedPulling="2026-02-20 10:16:45.722460077 +0000 UTC m=+1297.304931943" observedRunningTime="2026-02-20 10:16:46.288392575 +0000 UTC m=+1297.870864421" watchObservedRunningTime="2026-02-20 10:16:46.337030953 +0000 UTC m=+1297.919502799" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351432 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351469 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351481 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351489 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351498 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 
10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.374349 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.383552 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.397424 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.413851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.418717 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.422332 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.425550 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453624 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453674 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453840 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453884 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc 
kubenswrapper[4962]: I0220 10:16:46.454880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data" (OuterVolumeSpecName: "config-data") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.460567 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556807 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556870 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556924 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " 
pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556946 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556965 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.557042 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.557148 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.557691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.563891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.565279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.573363 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.659074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.659172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.659240 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.665399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.668904 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.675800 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.677257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.812022 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.157265 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" path="/var/lib/kubelet/pods/5989ba7a-f1ca-4a25-a94a-3fea17f16eca/volumes" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.157987 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" path="/var/lib/kubelet/pods/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea/volumes" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.158744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: W0220 10:16:47.165285 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a762e59_b6ef_4cdd_81f5_7f49dd78f810.slice/crio-483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1 WatchSource:0}: Error finding container 483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1: Status 404 returned error can't find the container with id 483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1 Feb 20 10:16:47 crc kubenswrapper[4962]: W0220 10:16:47.196821 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe24b8d_7968_4806_a924_d932f167185f.slice/crio-e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e WatchSource:0}: Error finding container e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e: Status 404 returned error can't find the container with id e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.202464 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 
10:16:47.220787 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981"} Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.220814 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.221158 4962 scope.go:117] "RemoveContainer" containerID="93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.223794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerStarted","Data":"483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1"} Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.228005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerStarted","Data":"e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e"} Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.246206 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.255894 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.259107 4962 scope.go:117] "RemoveContainer" containerID="e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.279758 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.282219 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.284830 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.289752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.290671 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.309348 4962 scope.go:117] "RemoveContainer" containerID="97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.309351 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376240 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376402 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.377976 4962 scope.go:117] 
"RemoveContainer" containerID="3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.477999 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478167 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478715 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.482042 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc 
kubenswrapper[4962]: I0220 10:16:47.483663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.483962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.493928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.499170 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.503201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.628139 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.171851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:48 crc kubenswrapper[4962]: W0220 10:16:48.188945 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c35c04f_5ec6_44c4_99d5_38a896dcae17.slice/crio-9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1 WatchSource:0}: Error finding container 9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1: Status 404 returned error can't find the container with id 9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1 Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.253440 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.279962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerStarted","Data":"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.301542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerStarted","Data":"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.301611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerStarted","Data":"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 
10:16:48.307899 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.307879074 podStartE2EDuration="2.307879074s" podCreationTimestamp="2026-02-20 10:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:48.305969434 +0000 UTC m=+1299.888441280" watchObservedRunningTime="2026-02-20 10:16:48.307879074 +0000 UTC m=+1299.890350920" Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.331984 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.331957955 podStartE2EDuration="2.331957955s" podCreationTimestamp="2026-02-20 10:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:48.327076335 +0000 UTC m=+1299.909548181" watchObservedRunningTime="2026-02-20 10:16:48.331957955 +0000 UTC m=+1299.914429801" Feb 20 10:16:49 crc kubenswrapper[4962]: I0220 10:16:49.157108 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457c772c-a7b8-40ea-8573-c483915687be" path="/var/lib/kubelet/pods/457c772c-a7b8-40ea-8573-c483915687be/volumes" Feb 20 10:16:49 crc kubenswrapper[4962]: I0220 10:16:49.327361 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc"} Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.343961 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613"} Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.491992 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.492094 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.585013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.368559 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a"} Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.508961 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.508975 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.812908 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 10:16:52 crc kubenswrapper[4962]: I0220 10:16:52.395729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766"} Feb 20 10:16:52 crc kubenswrapper[4962]: I0220 
10:16:52.397183 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:16:52 crc kubenswrapper[4962]: I0220 10:16:52.426062 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.856780937 podStartE2EDuration="5.42604068s" podCreationTimestamp="2026-02-20 10:16:47 +0000 UTC" firstStartedPulling="2026-02-20 10:16:48.195874565 +0000 UTC m=+1299.778346411" lastFinishedPulling="2026-02-20 10:16:51.765134308 +0000 UTC m=+1303.347606154" observedRunningTime="2026-02-20 10:16:52.421526801 +0000 UTC m=+1304.003998667" watchObservedRunningTime="2026-02-20 10:16:52.42604068 +0000 UTC m=+1304.008512526" Feb 20 10:16:54 crc kubenswrapper[4962]: I0220 10:16:54.570307 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.675987 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.676479 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.812477 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.865795 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 10:16:57 crc kubenswrapper[4962]: I0220 10:16:57.482401 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 10:16:57 crc kubenswrapper[4962]: I0220 10:16:57.758859 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" probeResult="failure" output="Get 
\"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:57 crc kubenswrapper[4962]: I0220 10:16:57.758878 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:17:00 crc kubenswrapper[4962]: I0220 10:17:00.503486 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 10:17:00 crc kubenswrapper[4962]: I0220 10:17:00.507199 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 10:17:00 crc kubenswrapper[4962]: I0220 10:17:00.512986 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 10:17:01 crc kubenswrapper[4962]: I0220 10:17:01.517104 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.432704 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.532664 4962 generic.go:334] "Generic (PLEG): container finished" podID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" exitCode=137 Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.533678 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.534243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerDied","Data":"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8"} Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.534338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerDied","Data":"bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504"} Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.534378 4962 scope.go:117] "RemoveContainer" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.576552 4962 scope.go:117] "RemoveContainer" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" Feb 20 10:17:03 crc kubenswrapper[4962]: E0220 10:17:03.577171 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8\": container with ID starting with a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8 not found: ID does not exist" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.577272 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8"} err="failed to get container status \"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8\": rpc error: code = NotFound desc = could not find container \"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8\": container with ID starting with 
a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8 not found: ID does not exist" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.595670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.595891 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.595993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.606510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7" (OuterVolumeSpecName: "kube-api-access-8chs7") pod "1d60578e-e3d0-4ae9-8539-9dfd84ebf836" (UID: "1d60578e-e3d0-4ae9-8539-9dfd84ebf836"). InnerVolumeSpecName "kube-api-access-8chs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.634395 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d60578e-e3d0-4ae9-8539-9dfd84ebf836" (UID: "1d60578e-e3d0-4ae9-8539-9dfd84ebf836"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.646016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data" (OuterVolumeSpecName: "config-data") pod "1d60578e-e3d0-4ae9-8539-9dfd84ebf836" (UID: "1d60578e-e3d0-4ae9-8539-9dfd84ebf836"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.698993 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.699114 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.699127 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.899084 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.918410 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.928310 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:03 crc kubenswrapper[4962]: E0220 10:17:03.929110 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.929138 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.929463 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.930430 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.932508 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.932716 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.933573 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.941259 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.106661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.106869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.107165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.107309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.107390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209532 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209586 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209686 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.215194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.225211 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.231937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.232854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.248065 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.255669 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.806749 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:04 crc kubenswrapper[4962]: W0220 10:17:04.811467 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf8f82d_76e8_4d49_ab1f_bc75cec4dc00.slice/crio-33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7 WatchSource:0}: Error finding container 33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7: Status 404 returned error can't find the container with id 33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7 Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.164888 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" path="/var/lib/kubelet/pods/1d60578e-e3d0-4ae9-8539-9dfd84ebf836/volumes" Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.568262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerStarted","Data":"aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83"} Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.568342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerStarted","Data":"33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7"} Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.604348 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.604317628 podStartE2EDuration="2.604317628s" podCreationTimestamp="2026-02-20 10:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:05.59722178 +0000 UTC m=+1317.179693676" watchObservedRunningTime="2026-02-20 10:17:05.604317628 +0000 UTC m=+1317.186789514" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.686253 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.687445 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.688494 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.693870 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.599167 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.604226 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.837655 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.839470 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.859138 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014442 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014465 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014517 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116789 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116968 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.117014 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.117044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118331 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118704 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.137283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.188156 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.684790 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:17:08 crc kubenswrapper[4962]: W0220 10:17:08.688398 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e4f70a2_b8ae_48cc_a098_5642fad8b040.slice/crio-43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05 WatchSource:0}: Error finding container 43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05: Status 404 returned error can't find the container with id 43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.255786 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.617887 4962 generic.go:334] "Generic (PLEG): container finished" podID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerID="b6772b9162a6a32cfbe3b48349f45c3e39e34e153494f4b09b124b0a0f86db0c" exitCode=0 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.618048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerDied","Data":"b6772b9162a6a32cfbe3b48349f45c3e39e34e153494f4b09b124b0a0f86db0c"} Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.618157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerStarted","Data":"43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05"} Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.744638 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:09 crc 
kubenswrapper[4962]: I0220 10:17:09.745279 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" containerID="cri-o://99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.746213 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" containerID="cri-o://caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.746354 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" containerID="cri-o://0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.746462 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" containerID="cri-o://22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.765946 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": read tcp 10.217.0.2:33678->10.217.0.197:3000: read: connection reset by peer" Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.259224 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637478 4962 generic.go:334] "Generic (PLEG): 
container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" exitCode=0 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637516 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" exitCode=2 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637524 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" exitCode=0 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637798 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.648375 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" containerID="cri-o://16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" gracePeriod=30 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.649972 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerStarted","Data":"337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.650017 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.650479 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" containerID="cri-o://a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" gracePeriod=30 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.689209 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" podStartSLOduration=3.689186894 podStartE2EDuration="3.689186894s" podCreationTimestamp="2026-02-20 10:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:10.682697574 +0000 UTC m=+1322.265169440" watchObservedRunningTime="2026-02-20 10:17:10.689186894 +0000 UTC m=+1322.271658740" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.216025 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.401684 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402262 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402366 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402604 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.403237 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.403705 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.410235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz" (OuterVolumeSpecName: "kube-api-access-nmccz") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "kube-api-access-nmccz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.412724 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts" (OuterVolumeSpecName: "scripts") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.444680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.463875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505279 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505315 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505328 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505338 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505347 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505356 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.506735 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: 
"9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.508138 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.508195 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.508247 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.509196 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.509263 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716" gracePeriod=600 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.544946 4962 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data" (OuterVolumeSpecName: "config-data") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.607673 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.607712 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671107 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" exitCode=0 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671281 4962 scope.go:117] "RemoveContainer" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671484 4962 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.704127 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.704078 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716" exitCode=0 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.706754 4962 generic.go:334] "Generic (PLEG): container finished" podID="ebe24b8d-7968-4806-a924-d932f167185f" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" exitCode=143 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.708653 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerDied","Data":"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.733842 4962 scope.go:117] "RemoveContainer" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.741413 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.759267 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.770112 4962 scope.go:117] "RemoveContainer" containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.777855 4962 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778429 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778450 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778474 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778507 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778513 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778533 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778539 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778800 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778813 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778833 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778842 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.781074 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.782878 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.784976 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.785136 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.801462 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.813553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"ceilometer-0\" (UID: 
\"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814405 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814427 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 
10:17:11.814698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.816805 4962 scope.go:117] "RemoveContainer" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.842911 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.846780 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-zb8hn log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="3122f194-31cc-4b80-93ce-20c0ab55f4dd" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.855233 4962 scope.go:117] "RemoveContainer" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.855734 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766\": container with ID starting with caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766 not found: ID does not exist" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.857304 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766"} err="failed to get container status 
\"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766\": rpc error: code = NotFound desc = could not find container \"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766\": container with ID starting with caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766 not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.857394 4962 scope.go:117] "RemoveContainer" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.860060 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a\": container with ID starting with 0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a not found: ID does not exist" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.860121 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a"} err="failed to get container status \"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a\": rpc error: code = NotFound desc = could not find container \"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a\": container with ID starting with 0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.860219 4962 scope.go:117] "RemoveContainer" containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.861308 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613\": container with ID starting with 22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613 not found: ID does not exist" containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.861372 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613"} err="failed to get container status \"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613\": rpc error: code = NotFound desc = could not find container \"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613\": container with ID starting with 22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613 not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.861406 4962 scope.go:117] "RemoveContainer" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.862517 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc\": container with ID starting with 99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc not found: ID does not exist" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.862546 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc"} err="failed to get container status \"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc\": rpc error: code = NotFound desc = could not find container \"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc\": container with ID 
starting with 99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.862567 4962 scope.go:117] "RemoveContainer" containerID="d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915641 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915679 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.917178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.917320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.922959 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.926392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.926748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.930613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.938004 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.941712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.723561 4962 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.723543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"} Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.734477 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.834442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836033 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836222 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: 
\"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836507 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836719 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.839101 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts" (OuterVolumeSpecName: "scripts") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.839351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.841883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.842409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.845956 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn" (OuterVolumeSpecName: "kube-api-access-zb8hn") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "kube-api-access-zb8hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.846651 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data" (OuterVolumeSpecName: "config-data") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.847904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.848350 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939634 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939686 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939705 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939717 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939728 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939738 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939752 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939766 4962 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.154378 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" path="/var/lib/kubelet/pods/9c35c04f-5ec6-44c4-99d5-38a896dcae17/volumes" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.736861 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.831278 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.845930 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.856561 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.865585 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.868671 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.869217 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.869355 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.907786 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969645 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969699 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969832 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.970023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.970152 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.970242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072099 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.073252 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.077133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.079631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.080185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 
10:17:14.080729 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.100727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.100840 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.107464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.249136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.256612 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.286037 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.325884 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.379768 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.379894 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.380019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.380148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.383253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs" (OuterVolumeSpecName: "logs") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.388777 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw" (OuterVolumeSpecName: "kube-api-access-nzrbw") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "kube-api-access-nzrbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.436792 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data" (OuterVolumeSpecName: "config-data") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.447694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487398 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487439 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487452 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487463 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.756480 4962 generic.go:334] "Generic (PLEG): container finished" podID="ebe24b8d-7968-4806-a924-d932f167185f" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" exitCode=0 Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.757998 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.759016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerDied","Data":"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d"} Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.759121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerDied","Data":"e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e"} Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.759252 4962 scope.go:117] "RemoveContainer" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.800391 4962 scope.go:117] "RemoveContainer" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.800649 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.810181 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: W0220 10:17:14.811808 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfae69c76_754d_4125_a405_23a3938e90a9.slice/crio-498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9 WatchSource:0}: Error finding container 498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9: Status 404 returned error can't find the container with id 498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9 Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.841227 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.853104 4962 scope.go:117] "RemoveContainer" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.856691 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d\": container with ID starting with a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d not found: ID does not exist" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.857144 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d"} err="failed to get container status \"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d\": rpc error: code = NotFound desc = could not find container \"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d\": container with ID starting with a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d not found: ID does not exist" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.857180 4962 scope.go:117] "RemoveContainer" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.858884 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046\": container with ID starting with 16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046 not found: ID does not exist" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.858980 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046"} err="failed to get container status \"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046\": rpc error: code = NotFound desc = could not find container \"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046\": container with ID starting with 16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046 not found: ID does not exist" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.867506 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.879666 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.880241 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880268 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.880316 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880323 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880527 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880559 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" Feb 
20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.882227 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.887078 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.887408 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.887795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896235 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896320 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.958370 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999350 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxfs\" (UniqueName: 
\"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999431 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.004658 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.009407 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.016387 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.017172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.038304 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.042206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.118670 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.120226 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.126037 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.126273 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.128814 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.174705 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3122f194-31cc-4b80-93ce-20c0ab55f4dd" path="/var/lib/kubelet/pods/3122f194-31cc-4b80-93ce-20c0ab55f4dd/volumes" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.175474 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe24b8d-7968-4806-a924-d932f167185f" path="/var/lib/kubelet/pods/ebe24b8d-7968-4806-a924-d932f167185f/volumes" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205925 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.251626 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306906 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: 
\"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.311571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.312157 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.318051 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.328964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 
10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.459213 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.742695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.774568 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a"} Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.774635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9"} Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.777753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerStarted","Data":"125fe601dfdf6769c35ec31a4db3fb414e225c1f0afbec478eb5d8be4fdc6a86"} Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.947871 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:17:15 crc kubenswrapper[4962]: W0220 10:17:15.959962 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39a7b81e_d4af_478f_b2c3_d21f117ad7ec.slice/crio-2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364 WatchSource:0}: Error finding container 2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364: Status 404 returned error can't find the container with id 2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364 Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.789477 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.792922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerStarted","Data":"f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.792980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerStarted","Data":"2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.795109 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerStarted","Data":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.795164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerStarted","Data":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.812275 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mm68z" podStartSLOduration=1.81225799 podStartE2EDuration="1.81225799s" podCreationTimestamp="2026-02-20 10:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:16.81097792 +0000 UTC m=+1328.393449766" watchObservedRunningTime="2026-02-20 10:17:16.81225799 
+0000 UTC m=+1328.394729836" Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.838350 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.838328473 podStartE2EDuration="2.838328473s" podCreationTimestamp="2026-02-20 10:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:16.832669029 +0000 UTC m=+1328.415140875" watchObservedRunningTime="2026-02-20 10:17:16.838328473 +0000 UTC m=+1328.420800319" Feb 20 10:17:17 crc kubenswrapper[4962]: I0220 10:17:17.807653 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.189792 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.277393 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.278343 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns" containerID="cri-o://3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" gracePeriod=10 Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.789058 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821484 4962 generic.go:334] "Generic (PLEG): container finished" podID="619a1578-177c-476f-a471-e39ec43ebf20" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" exitCode=0 Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerDied","Data":"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerDied","Data":"323c0906415aaaf20c526ccc0a5760d7fccfd56336d524a538bc100ce5c3c6b2"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821671 4962 scope.go:117] "RemoveContainer" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821813 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.828748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.828960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853409 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853447 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853517 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853616 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.865060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj" (OuterVolumeSpecName: "kube-api-access-fnccj") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "kube-api-access-fnccj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.867151 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5195883439999998 podStartE2EDuration="5.867121649s" podCreationTimestamp="2026-02-20 10:17:13 +0000 UTC" firstStartedPulling="2026-02-20 10:17:14.856246466 +0000 UTC m=+1326.438718312" lastFinishedPulling="2026-02-20 10:17:18.203779771 +0000 UTC m=+1329.786251617" observedRunningTime="2026-02-20 10:17:18.856020827 +0000 UTC m=+1330.438492673" watchObservedRunningTime="2026-02-20 10:17:18.867121649 +0000 UTC m=+1330.449593495" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.869644 4962 scope.go:117] "RemoveContainer" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.890465 4962 scope.go:117] "RemoveContainer" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" Feb 20 10:17:18 crc kubenswrapper[4962]: E0220 10:17:18.891063 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858\": container with ID starting with 3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858 not found: ID does not exist" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.891119 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858"} err="failed to get container status \"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858\": rpc error: code = NotFound desc = could not find container \"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858\": container with ID starting with 
3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858 not found: ID does not exist" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.891154 4962 scope.go:117] "RemoveContainer" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" Feb 20 10:17:18 crc kubenswrapper[4962]: E0220 10:17:18.891570 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a\": container with ID starting with 6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a not found: ID does not exist" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.891606 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a"} err="failed to get container status \"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a\": rpc error: code = NotFound desc = could not find container \"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a\": container with ID starting with 6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a not found: ID does not exist" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.923738 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config" (OuterVolumeSpecName: "config") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.923748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.927725 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.929006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.937615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956102 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956156 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956170 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956187 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956199 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956208 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:19 crc kubenswrapper[4962]: I0220 10:17:19.159455 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"]
Feb 20 10:17:19 crc kubenswrapper[4962]: I0220 10:17:19.170774 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"]
Feb 20 10:17:21 crc kubenswrapper[4962]: I0220 10:17:21.167454 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619a1578-177c-476f-a471-e39ec43ebf20" path="/var/lib/kubelet/pods/619a1578-177c-476f-a471-e39ec43ebf20/volumes"
Feb 20 10:17:21 crc kubenswrapper[4962]: I0220 10:17:21.892307 4962 generic.go:334] "Generic (PLEG): container finished" podID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerID="f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577" exitCode=0
Feb 20 10:17:21 crc kubenswrapper[4962]: I0220 10:17:21.892413 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerDied","Data":"f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577"}
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.344971 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z"
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.382853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") "
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.383109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") "
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.383368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") "
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.383555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") "
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.416270 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts" (OuterVolumeSpecName: "scripts") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.416748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr" (OuterVolumeSpecName: "kube-api-access-xrtxr") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "kube-api-access-xrtxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.423251 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.443284 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data" (OuterVolumeSpecName: "config-data") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485834 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485873 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485889 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485900 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.922515 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerDied","Data":"2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364"}
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.922565 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364"
Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.922663 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z"
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.159893 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.160476 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log" containerID="cri-o://2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" gracePeriod=30
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.160655 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api" containerID="cri-o://c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" gracePeriod=30
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.174263 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.174505 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" containerID="cri-o://8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" gracePeriod=30
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.216108 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.216875 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" containerID="cri-o://0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" gracePeriod=30
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.217429 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" containerID="cri-o://a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" gracePeriod=30
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.818668 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.921109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") "
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.921238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") "
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922342 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") "
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") "
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") "
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") "
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.923182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs" (OuterVolumeSpecName: "logs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.923494 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.930883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs" (OuterVolumeSpecName: "kube-api-access-xtxfs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "kube-api-access-xtxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.937557 4962 generic.go:334] "Generic (PLEG): container finished" podID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" exitCode=143
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.937636 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerDied","Data":"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e"}
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940661 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ede4992-1b80-4f08-a232-84f283cfedde" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" exitCode=0
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940692 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ede4992-1b80-4f08-a232-84f283cfedde" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" exitCode=143
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerDied","Data":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"}
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerDied","Data":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"}
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940761 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerDied","Data":"125fe601dfdf6769c35ec31a4db3fb414e225c1f0afbec478eb5d8be4fdc6a86"}
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940778 4962 scope.go:117] "RemoveContainer" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940819 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.953988 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.954737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data" (OuterVolumeSpecName: "config-data") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.980514 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.995782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.006930 4962 scope.go:117] "RemoveContainer" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025524 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025564 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025579 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025609 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025622 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.028366 4962 scope.go:117] "RemoveContainer" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.028767 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": container with ID starting with c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4 not found: ID does not exist" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.028810 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} err="failed to get container status \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": rpc error: code = NotFound desc = could not find container \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": container with ID starting with c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4 not found: ID does not exist"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.028839 4962 scope.go:117] "RemoveContainer" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.029362 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": container with ID starting with 2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06 not found: ID does not exist" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029393 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} err="failed to get container status \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": rpc error: code = NotFound desc = could not find container \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": container with ID starting with 2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06 not found: ID does not exist"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029411 4962 scope.go:117] "RemoveContainer" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029720 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} err="failed to get container status \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": rpc error: code = NotFound desc = could not find container \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": container with ID starting with c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4 not found: ID does not exist"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029743 4962 scope.go:117] "RemoveContainer" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.030013 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} err="failed to get container status \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": rpc error: code = NotFound desc = could not find container \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": container with ID starting with 2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06 not found: ID does not exist"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.275411 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.294229 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.306175 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.306909 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerName="nova-manage"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.306979 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerName="nova-manage"
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307051 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307104 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api"
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307165 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="init"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307216 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="init"
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307274 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307332 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log"
Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307412 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307465 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307732 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerName="nova-manage"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307807 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307889 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307953 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.309012 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.311948 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.312795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.313031 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.340183 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.436739 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.436961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.541137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.541758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.542669 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.542792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.543002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.543049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.541699 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.547454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.549196 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.549906 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.555413 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.586111 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0"
Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.655675 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.238173 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 10:17:26 crc kubenswrapper[4962]: W0220 10:17:26.245358 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241dc417_3176_4051_ad4e_d98f4f66ddc2.slice/crio-6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe WatchSource:0}: Error finding container 6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe: Status 404 returned error can't find the container with id 6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.433317 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.572066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") "
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.572715 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") "
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.572848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") "
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.576510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh" (OuterVolumeSpecName: "kube-api-access-v2xsh") pod "2a762e59-b6ef-4cdd-81f5-7f49dd78f810" (UID: "2a762e59-b6ef-4cdd-81f5-7f49dd78f810"). InnerVolumeSpecName "kube-api-access-v2xsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.606069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data" (OuterVolumeSpecName: "config-data") pod "2a762e59-b6ef-4cdd-81f5-7f49dd78f810" (UID: "2a762e59-b6ef-4cdd-81f5-7f49dd78f810"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.609443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a762e59-b6ef-4cdd-81f5-7f49dd78f810" (UID: "2a762e59-b6ef-4cdd-81f5-7f49dd78f810"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.675617 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.675673 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.675691 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") on node \"crc\" DevicePath \"\""
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.968624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerStarted","Data":"42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9"}
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.968671 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerStarted","Data":"d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3"}
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.968688 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerStarted","Data":"6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe"}
Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971328 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810"
containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" exitCode=0 Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerDied","Data":"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerDied","Data":"483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971411 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971429 4962 scope.go:117] "RemoveContainer" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.001959 4962 scope.go:117] "RemoveContainer" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" Feb 20 10:17:27 crc kubenswrapper[4962]: E0220 10:17:27.006863 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6\": container with ID starting with 8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6 not found: ID does not exist" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.006915 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6"} err="failed to get container status 
\"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6\": rpc error: code = NotFound desc = could not find container \"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6\": container with ID starting with 8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6 not found: ID does not exist" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.049238 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.049212832 podStartE2EDuration="2.049212832s" podCreationTimestamp="2026-02-20 10:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:27.007948362 +0000 UTC m=+1338.590420238" watchObservedRunningTime="2026-02-20 10:17:27.049212832 +0000 UTC m=+1338.631684688" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.052491 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.092646 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.107354 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: E0220 10:17:27.107899 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.107918 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.108104 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" Feb 20 10:17:27 crc kubenswrapper[4962]: 
I0220 10:17:27.108908 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.116293 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.122375 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.199646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.200117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.200153 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.206851 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" path="/var/lib/kubelet/pods/2a762e59-b6ef-4cdd-81f5-7f49dd78f810/volumes" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.213286 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8ede4992-1b80-4f08-a232-84f283cfedde" path="/var/lib/kubelet/pods/8ede4992-1b80-4f08-a232-84f283cfedde/volumes" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.301960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.302037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.302124 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.312485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.312854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.329836 
4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.389511 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:41048->10.217.0.192:8775: read: connection reset by peer" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.389486 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:41060->10.217.0.192:8775: read: connection reset by peer" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.439850 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.800174 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916806 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916922 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.918047 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs" (OuterVolumeSpecName: "logs") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.922583 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc" (OuterVolumeSpecName: "kube-api-access-58msc") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "kube-api-access-58msc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.955425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data" (OuterVolumeSpecName: "config-data") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.961699 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.977417 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992097 4962 generic.go:334] "Generic (PLEG): container finished" podID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" exitCode=0 Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerDied","Data":"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e"} Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992264 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerDied","Data":"121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53"} Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992292 4962 scope.go:117] "RemoveContainer" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992248 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.999776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: W0220 10:17:28.009348 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd02115_2eb9_4090_8225_108c3a8cad20.slice/crio-a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde WatchSource:0}: Error finding container a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde: Status 404 returned error can't find the container with id a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019561 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019614 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019628 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019646 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019666 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.040345 4962 scope.go:117] "RemoveContainer" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.071891 4962 scope.go:117] "RemoveContainer" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.076854 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e\": container with ID starting with a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e not found: ID does not exist" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.076924 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e"} err="failed to get container status \"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e\": rpc error: code = NotFound desc = could not find container \"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e\": container with ID starting with a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e not found: ID does not exist" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.076965 4962 scope.go:117] "RemoveContainer" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.077345 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e\": container with ID 
starting with 0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e not found: ID does not exist" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.077372 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e"} err="failed to get container status \"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e\": rpc error: code = NotFound desc = could not find container \"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e\": container with ID starting with 0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e not found: ID does not exist" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.080412 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.099827 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.113710 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.114114 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114131 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.114157 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114165 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114347 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114380 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.115357 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.118457 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.119561 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.121233 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224766 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: 
I0220 10:17:28.224931 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224956 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.327908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.329996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.330070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.330294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.330371 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.332362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.334996 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.336150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc 
kubenswrapper[4962]: I0220 10:17:28.338735 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0"
Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.352799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0"
Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.437786 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:28.919829 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 10:17:29 crc kubenswrapper[4962]: W0220 10:17:28.927967 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca793428_98ed_4f82_aa57_31d6671d546c.slice/crio-814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4 WatchSource:0}: Error finding container 814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4: Status 404 returned error can't find the container with id 814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4
Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.011660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerStarted","Data":"24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666"}
Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.011731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerStarted","Data":"a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde"}
Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.014624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerStarted","Data":"814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4"}
Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.044488 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.044465055 podStartE2EDuration="2.044465055s" podCreationTimestamp="2026-02-20 10:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:29.037330455 +0000 UTC m=+1340.619802341" watchObservedRunningTime="2026-02-20 10:17:29.044465055 +0000 UTC m=+1340.626936901"
Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.166295 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" path="/var/lib/kubelet/pods/bf680b24-e6dc-40a4-9ee4-521343fd9a28/volumes"
Feb 20 10:17:30 crc kubenswrapper[4962]: I0220 10:17:30.035546 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerStarted","Data":"2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b"}
Feb 20 10:17:30 crc kubenswrapper[4962]: I0220 10:17:30.036145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerStarted","Data":"fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89"}
Feb 20 10:17:30 crc kubenswrapper[4962]: I0220 10:17:30.079288 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.07925126 podStartE2EDuration="2.07925126s" podCreationTimestamp="2026-02-20 10:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:30.075190226 +0000 UTC m=+1341.657662152" watchObservedRunningTime="2026-02-20 10:17:30.07925126 +0000 UTC m=+1341.661723167"
Feb 20 10:17:32 crc kubenswrapper[4962]: I0220 10:17:32.440334 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 20 10:17:33 crc kubenswrapper[4962]: I0220 10:17:33.438366 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 10:17:33 crc kubenswrapper[4962]: I0220 10:17:33.438882 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 10:17:35 crc kubenswrapper[4962]: I0220 10:17:35.656401 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 10:17:35 crc kubenswrapper[4962]: I0220 10:17:35.656925 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 10:17:36 crc kubenswrapper[4962]: I0220 10:17:36.672835 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 10:17:36 crc kubenswrapper[4962]: I0220 10:17:36.672907 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 10:17:37 crc kubenswrapper[4962]: I0220 10:17:37.440889 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 20 10:17:37 crc kubenswrapper[4962]: I0220 10:17:37.475655 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 20 10:17:38 crc kubenswrapper[4962]: I0220 10:17:38.195165 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 20 10:17:38 crc kubenswrapper[4962]: I0220 10:17:38.438354 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 10:17:38 crc kubenswrapper[4962]: I0220 10:17:38.438434 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 20 10:17:39 crc kubenswrapper[4962]: I0220 10:17:39.450746 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 10:17:39 crc kubenswrapper[4962]: I0220 10:17:39.450825 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 10:17:44 crc kubenswrapper[4962]: I0220 10:17:44.266157 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.671893 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.672880 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.674695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.683881 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 10:17:46 crc kubenswrapper[4962]: I0220 10:17:46.271425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 10:17:46 crc kubenswrapper[4962]: I0220 10:17:46.281410 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 10:17:48 crc kubenswrapper[4962]: I0220 10:17:48.448998 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 10:17:48 crc kubenswrapper[4962]: I0220 10:17:48.452533 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 10:17:48 crc kubenswrapper[4962]: I0220 10:17:48.459795 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 10:17:49 crc kubenswrapper[4962]: I0220 10:17:49.316088 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 10:18:08 crc kubenswrapper[4962]: I0220 10:18:08.847732 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 20 10:18:08 crc kubenswrapper[4962]: I0220 10:18:08.849644 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient" containerID="cri-o://58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621" gracePeriod=2
Feb 20 10:18:08 crc kubenswrapper[4962]: I0220 10:18:08.888024 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.000483 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6f6vb"]
Feb 20 10:18:09 crc kubenswrapper[4962]: E0220 10:18:09.000984 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.000997 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.001187 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.015044 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.025456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.041677 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ptczd"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.077231 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6f6vb"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.082343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.082425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.119845 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ptczd"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.133916 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.183505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.183922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.184824 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.193766 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" path="/var/lib/kubelet/pods/598e051e-58af-4a1a-aa46-7f88d635f34c/volumes"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.194532 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.271781 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.273351 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.303451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.303726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.309098 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.322435 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.323658 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.369613 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6f6vb"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.412359 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.412502 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.413320 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.413632 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.460128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.525177 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.528100 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.582704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.603402 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.603715 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-k7csj" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" containerID="cri-o://c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3" gracePeriod=30
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.622670 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.623228 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" containerID="cri-o://095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" gracePeriod=30
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.623479 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter" containerID="cri-o://0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" gracePeriod=30
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.695046 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wj9f6"]
Feb 20 10:18:09 crc kubenswrapper[4962]: E0220 10:18:09.735706 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 20 10:18:09 crc kubenswrapper[4962]: E0220 10:18:09.735790 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:10.235764585 +0000 UTC m=+1381.818236431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.835355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts"
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.904520 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"]
Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.960907 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.146898 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mk67n"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.177841 4962 generic.go:334] "Generic (PLEG): container finished" podID="33d73a04-08b2-4944-861f-749a63c2565d" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" exitCode=2
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.177986 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerDied","Data":"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453"}
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.186585 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mk67n"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.212450 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k7csj_88c21489-524e-4ee7-a340-5be2573af161/openstack-network-exporter/0.log"
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.212502 4962 generic.go:334] "Generic (PLEG): container finished" podID="88c21489-524e-4ee7-a340-5be2573af161" containerID="c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3" exitCode=2
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.212536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerDied","Data":"c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3"}
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.221885 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.273721 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-s4qgr"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.313761 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"]
Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.316773 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.316838 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:11.316816615 +0000 UTC m=+1382.899288461 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.361653 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-smcqr"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.403778 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-s4qgr"]
Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.457885 4962 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-wj9f6" message=<
Feb 20 10:18:10 crc kubenswrapper[4962]: Exiting ovn-controller (1) [ OK ]
Feb 20 10:18:10 crc kubenswrapper[4962]: >
Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.457933 4962 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-wj9f6" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" containerID="cri-o://d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8"
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.457976 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-wj9f6" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" containerID="cri-o://d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" gracePeriod=30
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.458413 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.523695 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-smcqr"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.598717 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.621683 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.622521 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" containerID="cri-o://a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8" gracePeriod=300
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.657152 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.657694 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" containerID="cri-o://337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28" gracePeriod=10
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.704942 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.792558 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.809917 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9gcrq"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.888567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9gcrq"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.903569 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" containerID="cri-o://9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712" gracePeriod=300
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.919581 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-v7sjh"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.923412 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-v7sjh"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.952619 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.953362 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" containerID="cri-o://e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed" gracePeriod=300
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.978468 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"]
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.978761 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" containerID="cri-o://4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" gracePeriod=30
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.981259 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" containerID="cri-o://eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" gracePeriod=30
Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.990784 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.030327 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.030581 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" containerID="cri-o://f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1" gracePeriod=30
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.034029 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" containerID="cri-o://c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def" gracePeriod=30
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.084685 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.140102 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.232905 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" containerID="cri-o://b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48" gracePeriod=300
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.295480 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" path="/var/lib/kubelet/pods/14c237ea-eb42-49d4-90db-ee57e3b560e3/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.296865 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" path="/var/lib/kubelet/pods/21296df9-6e67-4427-959d-8d67bfd1393b/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.297839 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" path="/var/lib/kubelet/pods/2e7338a7-4012-439d-b961-6ca0c55dd6e6/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.298425 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" path="/var/lib/kubelet/pods/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.302001 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4feedd65-778f-471c-a2bf-23af2e459685" path="/var/lib/kubelet/pods/4feedd65-778f-471c-a2bf-23af2e459685/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.302750 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" path="/var/lib/kubelet/pods/6b114dbd-1f72-42c9-97c1-43795d1cf1ea/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.305055 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" path="/var/lib/kubelet/pods/7e2005e0-31d4-408f-8c66-187a6dd37bcd/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.351159 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.382059 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" path="/var/lib/kubelet/pods/84f50d98-6178-44d4-8ac4-43a8df4e3339/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.385179 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" path="/var/lib/kubelet/pods/85565888-6622-4dfc-9198-8e9c5b05cc75/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.386090 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" path="/var/lib/kubelet/pods/97e25820-62eb-4ad9-92ad-471c2f0f7ed4/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.387415 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" path="/var/lib/kubelet/pods/afbf9dd3-3bb5-4908-aad0-d06f09946e17/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388428 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_719faf26-7700-4eff-9dca-0a4ec3c51344/ovsdbserver-sb/0.log"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388544 4962 generic.go:334] "Generic (PLEG): container finished" podID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerID="a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8" exitCode=2
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388701 4962 generic.go:334] "Generic (PLEG): container finished" podID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerID="9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712" exitCode=143
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388895 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" path="/var/lib/kubelet/pods/d970dac6-1948-42dd-b5d9-c5df1b04e30d/volumes"
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.392805 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.392857 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerStarted","Data":"b35105a6f1f09300973fb51f5cc2ceed7e4acc42cd81be4a5215ef08b873fcd8"}
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393055 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393096 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerDied","Data":"a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8"}
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerDied","Data":"9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712"}
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393179 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393205 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393222 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393251 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393277 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393292 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h5ptn"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393307 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h5ptn"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393337 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393356 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393376 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zfmzb"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393391 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393404 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zfmzb"]
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.394038 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" containerID="cri-o://2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12" gracePeriod=30
Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.394426 4962 kuberuntime_container.go:808] "Killing container with a grace period"
pod="openstack/glance-default-external-api-0" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" containerID="cri-o://fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.395818 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" containerID="cri-o://7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.395977 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dfd6b5f7f-dkfsl" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd" containerID="cri-o://731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.395581 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" containerID="cri-o://aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.397973 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dfd6b5f7f-dkfsl" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api" containerID="cri-o://45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.401652 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server" containerID="cri-o://b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9" gracePeriod=30 Feb 20 10:18:11 crc 
kubenswrapper[4962]: I0220 10:18:11.401855 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron" containerID="cri-o://63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.401921 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync" containerID="cri-o://3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402017 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer" containerID="cri-o://6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402348 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor" containerID="cri-o://3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402418 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator" containerID="cri-o://05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402077 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater" 
containerID="cri-o://3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402618 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server" containerID="cri-o://6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404247 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor" containerID="cri-o://5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404389 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper" containerID="cri-o://1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404473 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator" containerID="cri-o://066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404611 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor" containerID="cri-o://4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404669 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater" containerID="cri-o://87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404732 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator" containerID="cri-o://8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.439113 4962 generic.go:334] "Generic (PLEG): container finished" podID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerID="337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28" exitCode=0 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.439269 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerDied","Data":"337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.458628 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server" containerID="cri-o://138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.460302 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.460349 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. 
No retries permitted until 2026-02-20 10:18:13.460331583 +0000 UTC m=+1385.042803419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.475295 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" containerID="cri-o://fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" gracePeriod=29 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.475575 4962 generic.go:334] "Generic (PLEG): container finished" podID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" exitCode=143 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.475648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerDied","Data":"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.512416 4962 generic.go:334] "Generic (PLEG): container finished" podID="755ca463-8c62-402c-8a88-a066fb38b521" containerID="58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621" exitCode=137 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.515842 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.537658 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.546000 4962 generic.go:334] "Generic 
(PLEG): container finished" podID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerID="d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" exitCode=0 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.546483 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerDied","Data":"d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8"} Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.578152 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.578228 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:12.078204816 +0000 UTC m=+1383.660676662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.585984 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.588819 4962 generic.go:334] "Generic (PLEG): container finished" podID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerID="e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed" exitCode=2 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.588875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerDied","Data":"e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed"} Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.623440 4962 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 20 10:18:11 crc kubenswrapper[4962]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 10:18:11 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNBridge=br-int Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Feb 20 10:18:11 crc kubenswrapper[4962]: ++ PhysicalNetworks= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNHostName= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 10:18:11 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch 
Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 10:18:11 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Feb 20 10:18:11 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 10:18:11 crc kubenswrapper[4962]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-r7g9h" message=< Feb 20 10:18:11 crc kubenswrapper[4962]: Exiting ovsdb-server (5) [ OK ] Feb 20 10:18:11 crc kubenswrapper[4962]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 10:18:11 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNBridge=br-int Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Feb 
20 10:18:11 crc kubenswrapper[4962]: ++ PhysicalNetworks= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNHostName= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 10:18:11 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 10:18:11 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Feb 20 10:18:11 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 10:18:11 crc kubenswrapper[4962]: > Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.623503 4962 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 20 10:18:11 crc kubenswrapper[4962]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 10:18:11 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNBridge=br-int Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Feb 20 10:18:11 crc kubenswrapper[4962]: ++ PhysicalNetworks= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNHostName= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 10:18:11 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 10:18:11 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Feb 20 10:18:11 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 10:18:11 crc kubenswrapper[4962]: > pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" containerID="cri-o://0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.623542 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" containerID="cri-o://0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" gracePeriod=28 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.628889 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k7csj_88c21489-524e-4ee7-a340-5be2573af161/openstack-network-exporter/0.log" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.628944 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.638434 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.645579 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd" containerID="cri-o://f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.646136 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log" containerID="cri-o://c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.699484 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.699769 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b8479d945-8wsh9" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" containerID="cri-o://1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.700194 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b8479d945-8wsh9" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" containerID="cri-o://d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.704184 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.718712 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.725841 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.731495 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.731758 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" containerID="cri-o://6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.731880 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" containerID="cri-o://5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.745291 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.745675 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" containerID="cri-o://d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.745862 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" containerID="cri-o://42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.753622 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.767061 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.767301 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" containerID="cri-o://a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.768284 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" containerID="cri-o://e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.774215 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783382 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.784692 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.785012 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" 
containerID="cri-o://fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.785627 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" containerID="cri-o://2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.786678 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.788135 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config" (OuterVolumeSpecName: "config") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.788202 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.798563 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.808933 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.823102 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.834839 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.854177 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm" (OuterVolumeSpecName: "kube-api-access-wxgrm") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "kube-api-access-wxgrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888645 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888717 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888865 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888974 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.889027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.893095 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.893137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894188 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run" (OuterVolumeSpecName: "var-run") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894350 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894362 4962 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894374 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894385 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894396 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894406 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894415 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.895146 4962 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts" (OuterVolumeSpecName: "scripts") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.906185 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9" (OuterVolumeSpecName: "kube-api-access-l52h9") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "kube-api-access-l52h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.929743 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.939329 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.944263 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.948758 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.957955 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.966032 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.966863 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.980387 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.996922 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.998519 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.998549 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.998559 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.008723 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.009019 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" containerID="cri-o://24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.018574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.018665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.052896 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.072233 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" containerID="cri-o://89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" gracePeriod=604800 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.108360 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.108397 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.108674 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.108738 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:13.108719175 +0000 UTC m=+1384.691191011 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.148820 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.164438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.212833 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.262132 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.278801 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" containerID="cri-o://a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.314766 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.314891 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.315041 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.315182 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.340047 4962 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd" (OuterVolumeSpecName: "kube-api-access-qdngd") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "kube-api-access-qdngd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.417355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.417642 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.418024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.418274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.418946 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.419569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.419815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.423335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: W0220 10:18:12.418881 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/755ca463-8c62-402c-8a88-a066fb38b521/volumes/kubernetes.io~configmap/openstack-config Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.428836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.432446 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.432475 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.422991 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.440968 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" containerID="cri-o://2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.491168 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.502718 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.506292 4962 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.506464 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.509921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.513824 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l" (OuterVolumeSpecName: "kube-api-access-nss8l") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "kube-api-access-nss8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.544199 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.544238 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.549017 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.562578 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.567397 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.577098 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.587229 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.587496 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614799 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" 
containerID="3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614831 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614839 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614847 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614855 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614862 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614869 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614877 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614883 4962 generic.go:334] "Generic (PLEG): container 
finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614890 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614897 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614903 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614909 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614916 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614963 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615028 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615036 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 
10:18:12.615053 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.616899 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerID="2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12" 
exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.616944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerDied","Data":"2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.618067 4962 generic.go:334] "Generic (PLEG): container finished" podID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerID="6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.618101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerDied","Data":"6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.663159 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.663281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.710879 4962 generic.go:334] "Generic (PLEG): container finished" podID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.710988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" 
event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerDied","Data":"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.728870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.730176 4962 generic.go:334] "Generic (PLEG): container finished" podID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerID="c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.730232 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerDied","Data":"c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.731672 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerID="731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.731709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerDied","Data":"731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.733248 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") on node \"crc\" 
DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.752286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerDied","Data":"43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.752344 4962 scope.go:117] "RemoveContainer" containerID="337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.752476 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.789241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config" (OuterVolumeSpecName: "config") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.789426 4962 generic.go:334] "Generic (PLEG): container finished" podID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerID="d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.789489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerDied","Data":"d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.821465 4962 generic.go:334] "Generic (PLEG): container finished" podID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerID="aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.821573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerDied","Data":"aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.838205 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.840318 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.840332 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.846522 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.846579 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerDied","Data":"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.855818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.865011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.879082 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_719faf26-7700-4eff-9dca-0a4ec3c51344/ovsdbserver-sb/0.log" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.879371 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.881916 4962 generic.go:334] "Generic (PLEG): container finished" podID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerID="b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.882024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerDied","Data":"b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.900612 4962 scope.go:117] "RemoveContainer" containerID="b6772b9162a6a32cfbe3b48349f45c3e39e34e153494f4b09b124b0a0f86db0c" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.901318 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.909243 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f35bada-015d-4051-9976-d5dfe3a93216" containerID="1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.909329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerDied","Data":"1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.912260 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.912400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerDied","Data":"f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.915353 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.935679 4962 generic.go:334] "Generic (PLEG): container finished" podID="ca793428-98ed-4f82-aa57-31d6671d546c" containerID="fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.935809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerDied","Data":"fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.946773 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.947804 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948142 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 
crc kubenswrapper[4962]: I0220 10:18:12.949665 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949973 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.951150 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerDied","Data":"1ed5bd754fe42b78759f03224b6a39f1b92d8d484574e9a6557ab622debe2a23"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.953972 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.955719 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.955755 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.958539 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts" (OuterVolumeSpecName: "scripts") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.969567 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config" (OuterVolumeSpecName: "config") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.972519 4962 scope.go:117] "RemoveContainer" containerID="c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3" Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.973998 4962 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 10:18:12 crc kubenswrapper[4962]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: if [ -n "barbican" ]; then Feb 20 10:18:12 crc kubenswrapper[4962]: GRANT_DATABASE="barbican" Feb 20 10:18:12 crc kubenswrapper[4962]: else Feb 20 10:18:12 crc kubenswrapper[4962]: GRANT_DATABASE="*" Feb 20 10:18:12 crc kubenswrapper[4962]: fi Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: # going for maximum compatibility here: Feb 20 10:18:12 crc kubenswrapper[4962]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 10:18:12 crc kubenswrapper[4962]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 10:18:12 crc kubenswrapper[4962]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 20 10:18:12 crc kubenswrapper[4962]: # support updates Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: $MYSQL_CMD < logger="UnhandledError" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.974285 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.977011 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-c46d-account-create-update-gfqts" podUID="cca18a27-31bc-440b-a4a9-517b3323bb91" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.991861 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9" (OuterVolumeSpecName: "kube-api-access-qsrg9") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "kube-api-access-qsrg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.992052 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.040010 4962 scope.go:117] "RemoveContainer" containerID="d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.061826 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062704 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062734 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062783 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: 
\"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062856 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062895 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063700 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063717 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063727 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063737 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063757 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.074632 4962 scope.go:117] "RemoveContainer" containerID="58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.084836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts" (OuterVolumeSpecName: "scripts") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.085433 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.095667 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.085519 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config" (OuterVolumeSpecName: "config") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.112063 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.115399 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.132286 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882" (OuterVolumeSpecName: "kube-api-access-bd882") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "kube-api-access-bd882". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166587 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166629 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166639 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166805 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.166905 4962 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.166985 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:15.166963816 +0000 UTC m=+1386.749435662 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.174888 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.191277 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" path="/var/lib/kubelet/pods/0275d40a-1206-4eb2-96c8-6c516c57bed7/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.191935 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" path="/var/lib/kubelet/pods/032f830f-9636-4783-a048-00f9b7b22a3a/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.192939 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" path="/var/lib/kubelet/pods/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.201860 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" path="/var/lib/kubelet/pods/1f853840-0af1-40ee-b11b-a0a62f9f4ebf/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.202521 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" path="/var/lib/kubelet/pods/20663c25-09a7-4a31-9994-450f507d4ff1/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.203117 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" path="/var/lib/kubelet/pods/28bfacb3-7247-41ad-bf30-47c81427487b/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.204248 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" path="/var/lib/kubelet/pods/2b915fcc-cf15-43c3-97c6-bde3a29da796/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.204788 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" 
path="/var/lib/kubelet/pods/383d4f1e-72b3-48ce-9427-0361c19e41fc/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.205421 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" path="/var/lib/kubelet/pods/39a7b81e-d4af-478f-b2c3-d21f117ad7ec/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.206433 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" path="/var/lib/kubelet/pods/684fc9d7-94f0-418a-b059-e5519e6cd316/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.207094 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755ca463-8c62-402c-8a88-a066fb38b521" path="/var/lib/kubelet/pods/755ca463-8c62-402c-8a88-a066fb38b521/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.207752 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" path="/var/lib/kubelet/pods/79394db3-1fa2-4b8f-927a-1cf8085f1df4/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.208565 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" path="/var/lib/kubelet/pods/7c7420bd-d4ef-4511-acf4-a132ad0a5677/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.211144 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" path="/var/lib/kubelet/pods/7da93993-8b14-45f6-8d0b-8366becc762e/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.211857 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" path="/var/lib/kubelet/pods/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.212566 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e761565e-55de-43bc-b82d-95b776652b5c" 
path="/var/lib/kubelet/pods/e761565e-55de-43bc-b82d-95b776652b5c/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.269105 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.269195 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.269213 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.272219 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.321721 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" containerID="cri-o://f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" gracePeriod=604800 Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.353102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.385371 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.416722 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.491311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.492992 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.493023 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.493094 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.493146 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. 
No retries permitted until 2026-02-20 10:18:17.493128328 +0000 UTC m=+1389.075600174 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.530939 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.540896 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.572782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.596211 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.596255 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.596267 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.642852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.693727 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.721110 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.721139 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.865154 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.880611 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.888123 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.888417 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5b685f5b9-4db6w" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" containerID="cri-o://68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9" gracePeriod=30 Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.888908 
4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5b685f5b9-4db6w" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" containerID="cri-o://42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632" gracePeriod=30 Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.904809 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.904890 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.000667 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod812fea74_e4e5_4550_8a20_8fe04752a016.slice/crio-93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f35bada_015d_4051_9976_d5dfe3a93216.slice/crio-d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod812fea74_e4e5_4550_8a20_8fe04752a016.slice/crio-conmon-93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f35bada_015d_4051_9976_d5dfe3a93216.slice/crio-conmon-d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd889b7_1b72_4e57_ad0f_85facbad8da4.slice/crio-a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.001196 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f35bada-015d-4051-9976-d5dfe3a93216" containerID="d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.001265 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerDied","Data":"d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.002962 4962 generic.go:334] "Generic (PLEG): container finished" podID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerID="aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.003012 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerDied","Data":"aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.007626 4962 generic.go:334] "Generic (PLEG): container finished" podID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerID="5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.007685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerDied","Data":"5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022529 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_719faf26-7700-4eff-9dca-0a4ec3c51344/ovsdbserver-sb/0.log" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022647 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerDied","Data":"a2d2a8a63bf5c9ebd610b16b09ca46a05d03ae717f57b9ce876334d685870041"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022717 4962 scope.go:117] "RemoveContainer" containerID="a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022983 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.050315 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.053181 4962 generic.go:334] "Generic (PLEG): container finished" podID="812fea74-e4e5-4550-8a20-8fe04752a016" containerID="93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1" exitCode=1 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.053262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerDied","Data":"93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.054124 4962 scope.go:117] "RemoveContainer" containerID="93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.084824 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-gfqts" event={"ID":"cca18a27-31bc-440b-a4a9-517b3323bb91","Type":"ContainerStarted","Data":"f6015af78b401355bf39302e0c5756af3b69a15cfa67686aab9f59e8e5466d2c"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.104900 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.108566 4962 generic.go:334] "Generic (PLEG): container finished" podID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerID="f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.108726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerDied","Data":"f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.126139 4962 scope.go:117] "RemoveContainer" 
containerID="9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.126437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerDied","Data":"2226a3425cb913ac33dc3114a16db2100facfc7423dff93548d53775b718e6e2"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.126546 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.137785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.137826 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.137956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.139280 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 
10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.139420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.160010 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn" (OuterVolumeSpecName: "kube-api-access-bznfn") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "kube-api-access-bznfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.162329 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.188114 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.189754 4962 generic.go:334] "Generic (PLEG): container finished" podID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerID="a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.190234 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerDied","Data":"a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.224128 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data" (OuterVolumeSpecName: "config-data") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.259417 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.259474 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.259488 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.267569 4962 scope.go:117] "RemoveContainer" 
containerID="e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed" Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.267624 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.267797 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.271449 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.271660 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273112 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273149 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273167 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273233 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.310171 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.315465 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.317613 4962 scope.go:117] "RemoveContainer" 
containerID="b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.318579 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.332517 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.358582 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.386382 4962 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.429390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.440606 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.168:8778/\": read tcp 10.217.0.2:35648->10.217.0.168:8778: read: connection reset by peer" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.440606 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.168:8778/\": read tcp 10.217.0.2:35640->10.217.0.168:8778: read: connection reset by peer" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488361 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488519 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488784 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " 
Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.489294 4962 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.491985 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.492809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs" (OuterVolumeSpecName: "logs") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.521564 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts" (OuterVolumeSpecName: "scripts") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.523771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.530953 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7" (OuterVolumeSpecName: "kube-api-access-v96c7") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "kube-api-access-v96c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.536809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k" (OuterVolumeSpecName: "kube-api-access-5494k") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "kube-api-access-5494k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.539234 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.576565 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.590926 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593073 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593095 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593106 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593114 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593123 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593132 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593141 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593153 4962 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.652821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.687008 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data" (OuterVolumeSpecName: "config-data") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.695809 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.695830 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.724854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data" (OuterVolumeSpecName: "config-data") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.803162 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.869542 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.872930 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.891890 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.900866 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.003803 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": read tcp 10.217.0.2:54310->10.217.0.165:8776: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008294 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"cca18a27-31bc-440b-a4a9-517b3323bb91\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008410 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008769 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008880 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008967 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008999 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009015 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009116 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009140 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009174 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009202 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod 
\"cca18a27-31bc-440b-a4a9-517b3323bb91\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009265 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009298 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009357 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.014108 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs" (OuterVolumeSpecName: "logs") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.014168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.014265 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs" (OuterVolumeSpecName: "logs") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.017907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cca18a27-31bc-440b-a4a9-517b3323bb91" (UID: "cca18a27-31bc-440b-a4a9-517b3323bb91"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.020927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.024130 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts" (OuterVolumeSpecName: "scripts") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.025163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.028020 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl" (OuterVolumeSpecName: "kube-api-access-s9xjl") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "kube-api-access-s9xjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.036223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.037184 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.037455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz" (OuterVolumeSpecName: "kube-api-access-6mbmz") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "kube-api-access-6mbmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.042434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.044446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx" (OuterVolumeSpecName: "kube-api-access-ps9mx") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "kube-api-access-ps9mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.050882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt" (OuterVolumeSpecName: "kube-api-access-5cwxt") pod "cca18a27-31bc-440b-a4a9-517b3323bb91" (UID: "cca18a27-31bc-440b-a4a9-517b3323bb91"). InnerVolumeSpecName "kube-api-access-5cwxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.061231 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:58810->10.217.0.206:8775: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.061403 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:58814->10.217.0.206:8775: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.111112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113237 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113264 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113307 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113320 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113330 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113338 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113347 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") on 
node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113355 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113363 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113372 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113380 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113388 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113397 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.114019 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: 
I0220 10:18:15.114036 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.166132 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" path="/var/lib/kubelet/pods/2e4f70a2-b8ae-48cc-a098-5642fad8b040/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.171863 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" path="/var/lib/kubelet/pods/719faf26-7700-4eff-9dca-0a4ec3c51344/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.172672 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" path="/var/lib/kubelet/pods/801fa82d-0f57-4af2-9eec-b6cddac658ab/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.174088 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c21489-524e-4ee7-a340-5be2573af161" path="/var/lib/kubelet/pods/88c21489-524e-4ee7-a340-5be2573af161/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.188882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data" (OuterVolumeSpecName: "config-data") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.207459 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.211726 4962 generic.go:334] "Generic (PLEG): container finished" podID="812fea74-e4e5-4550-8a20-8fe04752a016" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" exitCode=1 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.212515 4962 scope.go:117] "RemoveContainer" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.213070 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-6f6vb_openstack(812fea74-e4e5-4550-8a20-8fe04752a016)\"" pod="openstack/root-account-create-update-6f6vb" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.213887 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.218394 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.218421 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.218431 4962 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.218496 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.218536 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:19.218519641 +0000 UTC m=+1390.800991487 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.240035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerDied","Data":"156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.240103 4962 scope.go:117] "RemoveContainer" containerID="93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.242822 4962 generic.go:334] "Generic (PLEG): container finished" podID="ca793428-98ed-4f82-aa57-31d6671d546c" containerID="2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.242938 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerDied","Data":"2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.269253 4962 generic.go:334] "Generic (PLEG): container finished" podID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerID="7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.269332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerDied","Data":"7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.280231 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerID="fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.280307 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerDied","Data":"fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.288060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.288884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.302582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerDied","Data":"a34d63171eeb032e506f3c3f6390187d10864d694aff1bd3157c782304896d3f"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.302785 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.308859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data" (OuterVolumeSpecName: "config-data") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.328655 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.328689 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.328699 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.330084 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.335799 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerDied","Data":"33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.336003 4962 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.368650 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.368880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerDied","Data":"b094901d04b6844ae7ff61500f6dbd375cab8bf6c8a00346003f62e1a980cada"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.369002 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.372772 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.375084 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:58282->10.217.0.164:9311: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.375196 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:58270->10.217.0.164:9311: read: connection reset by 
peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.376454 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.380300 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.392504 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerDied","Data":"0bed354fd9a98e89b5d38e5675524156eb0b61c69b251716c3b22a1d0bef6443"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.392635 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397182 4962 generic.go:334] "Generic (PLEG): container finished" podID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerDied","Data":"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397273 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerDied","Data":"25d2258a03970a75594e5384f741d4a8aaad9e37d3b0b7c512e80fa795dc3283"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397326 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.413373 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerDied","Data":"d0a92b505f163c98c2579b38133407e2587dcd82e4a7d6302d1e3ca2e2112d68"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.413489 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.427547 4962 scope.go:117] "RemoveContainer" containerID="d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428498 4962 generic.go:334] "Generic (PLEG): container finished" podID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerID="42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428523 4962 generic.go:334] "Generic (PLEG): container finished" podID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerID="68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428565 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerDied","Data":"42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerDied","Data":"68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.430499 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.430524 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.431931 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-gfqts" event={"ID":"cca18a27-31bc-440b-a4a9-517b3323bb91","Type":"ContainerDied","Data":"f6015af78b401355bf39302e0c5756af3b69a15cfa67686aab9f59e8e5466d2c"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.431990 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.532639 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.549812 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.555907 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.557897 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.569063 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.581715 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 
10:18:15.586922 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.586983 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.587226 4962 scope.go:117] "RemoveContainer" containerID="1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.590974 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.630686 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635470 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc 
kubenswrapper[4962]: I0220 10:18:15.635701 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636010 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636662 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod 
\"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.638518 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.639008 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.640019 4962 scope.go:117] "RemoveContainer" containerID="aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.654364 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.674420 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.682549 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.716847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb" (OuterVolumeSpecName: "kube-api-access-xsclb") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "kube-api-access-xsclb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.723902 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.739971 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.742516 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.742541 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.742554 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.753783 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.758264 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.771866 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.776784 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.825538 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data" (OuterVolumeSpecName: "config-data") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.827331 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.832461 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.845853 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846727 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846759 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846771 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846781 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.857320 4962 scope.go:117] "RemoveContainer" containerID="c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.870517 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.886990 4962 scope.go:117] "RemoveContainer" containerID="f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.918941 4962 scope.go:117] "RemoveContainer" containerID="5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.947927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948046 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948141 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948317 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 
10:18:15.948343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.950498 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs" (OuterVolumeSpecName: "logs") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.951945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs" (OuterVolumeSpecName: "logs") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.953098 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.961209 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr" (OuterVolumeSpecName: "kube-api-access-gvdkr") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "kube-api-access-gvdkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.978010 4962 scope.go:117] "RemoveContainer" containerID="6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.987145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts" (OuterVolumeSpecName: "scripts") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.041328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc" (OuterVolumeSpecName: "kube-api-access-hlgvc") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "kube-api-access-hlgvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.047747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.047880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053133 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053165 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053175 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053184 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053193 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053201 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053209 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053218 4962 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.082879 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.083447 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent" containerID="cri-o://cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.083879 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd" containerID="cri-o://ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.083991 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core" containerID="cri-o://6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.084084 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent" containerID="cri-o://ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.088711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data" (OuterVolumeSpecName: "config-data") pod 
"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.156318 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.181796 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.182044 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics" containerID="cri-o://490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.188756 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.227770 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.228351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.240111 4962 scope.go:117] "RemoveContainer" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.256535 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.256632 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.258027 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.348494 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.352025 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.360172 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.361335 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.365217 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.367779 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data" (OuterVolumeSpecName: "config-data") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.432861 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476449 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476504 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476601 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476633 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476677 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476862 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.477306 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.477316 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.481351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.482030 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs" (OuterVolumeSpecName: "logs") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.502308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.502541 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached" containerID="cri-o://2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.503241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.518986 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c" (OuterVolumeSpecName: "kube-api-access-vqv6c") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). 
InnerVolumeSpecName "kube-api-access-vqv6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.519255 4962 scope.go:117] "RemoveContainer" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.540688 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts" (OuterVolumeSpecName: "scripts") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.550730 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerDied","Data":"cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.550873 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581841 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581869 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581932 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581942 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581951 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.621402 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.649303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.657495 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.683933 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.684323 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698550 4962 generic.go:334] "Generic (PLEG): container finished" podID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerID="42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9" exitCode=0 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerDied","Data":"42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerDied","Data":"6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698740 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.724815 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.724850 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerDied","Data":"6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.734567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.764264 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765185 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765205 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765224 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765257 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765270 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765279 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765286 4962 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765329 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765339 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765346 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765354 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765361 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765369 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765376 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765405 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765413 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765422 4962 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765428 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765439 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765447 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765484 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765491 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765506 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765512 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765544 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765551 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" Feb 20 10:18:16 crc kubenswrapper[4962]: 
E0220 10:18:16.765563 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765569 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765576 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765582 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765674 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765683 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765694 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765700 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765735 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765742 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" Feb 20 10:18:16 crc 
kubenswrapper[4962]: E0220 10:18:16.766102 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="mysql-bootstrap" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766113 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="mysql-bootstrap" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766124 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="init" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766131 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="init" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766142 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766158 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766167 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766173 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766190 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766196 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766204 4962 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766210 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766220 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766227 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766239 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766246 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766255 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766261 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766430 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766444 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 
10:18:16.766454 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766464 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766476 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766488 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766496 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766505 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766515 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766524 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766534 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766545 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" 
containerName="dnsmasq-dns" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766555 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766564 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766573 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766583 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766607 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766615 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766625 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766635 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766641 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766652 4962 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766659 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766666 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766677 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.767495 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.774815 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.790612 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.795835 4962 generic.go:334] "Generic (PLEG): container finished" podID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerID="490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4" exitCode=2 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.796204 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerDied","Data":"490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.796421 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.797487 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.818210 4962 scope.go:117] "RemoveContainer" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.819255 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data" (OuterVolumeSpecName: "config-data") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.819307 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.829729 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.839176 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.851798 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548\": container with ID starting with eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548 not found: ID does not exist" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.851842 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548"} err="failed to get container status \"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548\": rpc error: code = NotFound desc = could not find container \"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548\": container with ID starting with eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548 not found: ID does not exist" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.851871 4962 scope.go:117] "RemoveContainer" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.851963 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 
10:18:16.853793 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53\": container with ID starting with 4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53 not found: ID does not exist" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.853858 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53"} err="failed to get container status \"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53\": rpc error: code = NotFound desc = could not find container \"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53\": container with ID starting with 4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53 not found: ID does not exist" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.853894 4962 scope.go:117] "RemoveContainer" containerID="a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.865392 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.887276 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.890838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892074 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892674 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892849 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892957 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.893084 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.893415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.893565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.896113 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerDied","Data":"a7c3f6bf061e2f58df1199abfaabc0fa7edc0079e61af3f51614ef7b77cc0b31"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.896170 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.903911 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs" (OuterVolumeSpecName: "logs") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.915573 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.915631 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.916883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz" (OuterVolumeSpecName: "kube-api-access-2qmjz") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "kube-api-access-2qmjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.934216 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.935183 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.945647 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" exitCode=0 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.946080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerDied","Data":"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.946137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerDied","Data":"256cfc6edb7fdfbe31dd4d739c6bcf21323de33dda20f71407beaea0eb6fd7bc"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.952041 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.952299 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6b4c54c5d9-pqd8r" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api" containerID="cri-o://57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.967045 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.975355 4962 generic.go:334] "Generic (PLEG): container finished" podID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" exitCode=0 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.975892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerDied","Data":"2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.994293 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.006961 4962 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-6f6vb" secret="" err="secret \"galera-openstack-dockercfg-898r2\" not found" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.007017 4962 scope.go:117] "RemoveContainer" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.007320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-6f6vb_openstack(812fea74-e4e5-4550-8a20-8fe04752a016)\"" pod="openstack/root-account-create-update-6f6vb" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.008711 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.012528 4962 scope.go:117] "RemoveContainer" containerID="f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018787 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018958 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018982 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc 
kubenswrapper[4962]: I0220 10:18:17.019232 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019334 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019360 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019392 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019446 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019799 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: 
\"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019963 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019983 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.020052 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.020110 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.52009344 +0000 UTC m=+1389.102565276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.023136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.023493 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerDied","Data":"814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4"} Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.023628 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.028832 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np" (OuterVolumeSpecName: "kube-api-access-zw2np") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "kube-api-access-zw2np". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.029308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.032851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.033735 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data" (OuterVolumeSpecName: "config-data") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.041194 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs" (OuterVolumeSpecName: "logs") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.041299 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.042122 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zgjrp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-125a-account-create-update-rtszm" podUID="0991ff2f-16e5-4891-a38d-8cb9e4b016ec" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.071260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g" (OuterVolumeSpecName: "kube-api-access-mrb2g") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "kube-api-access-mrb2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.071366 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs" (OuterVolumeSpecName: "logs") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.074169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.084290 4962 projected.go:194] Error preparing data for projected volume kube-api-access-zgjrp for pod openstack/keystone-125a-account-create-update-rtszm: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.084387 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.584360587 +0000 UTC m=+1389.166832433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zgjrp" (UniqueName: "kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.085045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts" (OuterVolumeSpecName: "scripts") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094371 4962 generic.go:334] "Generic (PLEG): container finished" podID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" exitCode=0 Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerDied","Data":"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515"} Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerDied","Data":"3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13"} Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094633 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.104098 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.127670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128615 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128645 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128656 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128666 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrb2g\" (UniqueName: 
\"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128680 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128688 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128698 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128707 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128719 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128730 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128738 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: 
I0220 10:18:17.128770 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.129464 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.129535 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts podName:812fea74-e4e5-4550-8a20-8fe04752a016 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.629509632 +0000 UTC m=+1389.211981478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts") pod "root-account-create-update-6f6vb" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.185758 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.200239 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" path="/var/lib/kubelet/pods/28437fcd-377a-4b9e-9a28-e01c21e2ad1f/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.207988 4962 scope.go:117] "RemoveContainer" containerID="42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.208845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.230326 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" path="/var/lib/kubelet/pods/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.233347 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37eccece-549c-4b2f-b066-481b216d7ece" path="/var/lib/kubelet/pods/37eccece-549c-4b2f-b066-481b216d7ece/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.235097 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" path="/var/lib/kubelet/pods/4a879cb3-19b4-4767-8640-993cc47dc7ed/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.237452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" path="/var/lib/kubelet/pods/559addbd-1bc6-4146-9a27-ce3e1d3d08fd/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.240344 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.241752 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.241371 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" path="/var/lib/kubelet/pods/7f35bada-015d-4051-9976-d5dfe3a93216/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 
10:18:17.243158 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" path="/var/lib/kubelet/pods/8dd889b7-1b72-4e57-ad0f-85facbad8da4/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.249087 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c97128d-8360-482e-b05b-6025d046c122" path="/var/lib/kubelet/pods/9c97128d-8360-482e-b05b-6025d046c122/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.257237 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" path="/var/lib/kubelet/pods/a3d903f3-8f86-49e2-848b-4a59a9068b75/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.260821 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" path="/var/lib/kubelet/pods/c90d5126-d89a-42e6-9b7d-bfc53475bc56/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.262099 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.269012 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera" containerID="cri-o://0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b" gracePeriod=30 Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.277258 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.277617 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca18a27-31bc-440b-a4a9-517b3323bb91" path="/var/lib/kubelet/pods/cca18a27-31bc-440b-a4a9-517b3323bb91/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.282065 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" path="/var/lib/kubelet/pods/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.287241 4962 scope.go:117] "RemoveContainer" containerID="68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.287466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data" (OuterVolumeSpecName: "config-data") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.291886 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.291933 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.291953 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.302074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.304975 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data" (OuterVolumeSpecName: "config-data") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.323822 4962 scope.go:117] "RemoveContainer" containerID="7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.341199 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.343562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"ce62af15-166f-4f74-a244-2de5147a4b2f\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.343649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"ce62af15-166f-4f74-a244-2de5147a4b2f\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.343708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"ce62af15-166f-4f74-a244-2de5147a4b2f\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " Feb 20 10:18:17 crc kubenswrapper[4962]: 
I0220 10:18:17.344101 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344122 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344134 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344144 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.347789 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp" (OuterVolumeSpecName: "kube-api-access-px4mp") pod "ce62af15-166f-4f74-a244-2de5147a4b2f" (UID: "ce62af15-166f-4f74-a244-2de5147a4b2f"). InnerVolumeSpecName "kube-api-access-px4mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.358415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.379221 4962 scope.go:117] "RemoveContainer" containerID="aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.379385 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.386112 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.391826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce62af15-166f-4f74-a244-2de5147a4b2f" (UID: "ce62af15-166f-4f74-a244-2de5147a4b2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.397077 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data" (OuterVolumeSpecName: "config-data") pod "ce62af15-166f-4f74-a244-2de5147a4b2f" (UID: "ce62af15-166f-4f74-a244-2de5147a4b2f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.401874 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.409240 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.442169 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.445901 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.452675 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.452753 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 
10:18:17.453353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.453416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.454270 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.454527 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455050 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455084 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 
crc kubenswrapper[4962]: I0220 10:18:17.455095 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455105 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.458405 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm" (OuterVolumeSpecName: "kube-api-access-s7dpm") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "kube-api-access-s7dpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.481056 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.488451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.513756 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556847 4962 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556860 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556872 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556881 4962 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.556963 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.557024 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:25.557004006 +0000 UTC m=+1397.139475852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.557372 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.557394 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:18.557387728 +0000 UTC m=+1390.139859574 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.629691 4962 scope.go:117] "RemoveContainer" containerID="fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.658195 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.658676 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.658804 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts podName:812fea74-e4e5-4550-8a20-8fe04752a016 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:18.658776882 +0000 UTC m=+1390.241248728 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts") pod "root-account-create-update-6f6vb" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.669793 4962 projected.go:194] Error preparing data for projected volume kube-api-access-zgjrp for pod openstack/keystone-125a-account-create-update-rtszm: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.669940 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:18.669902716 +0000 UTC m=+1390.252374572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zgjrp" (UniqueName: "kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.670682 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.675666 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.722177 4962 scope.go:117] "RemoveContainer" containerID="2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.786837 4962 scope.go:117] "RemoveContainer" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" Feb 20 10:18:17 crc 
kubenswrapper[4962]: I0220 10:18:17.825729 4962 scope.go:117] "RemoveContainer" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.858432 4962 scope.go:117] "RemoveContainer" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.871398 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f\": container with ID starting with f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f not found: ID does not exist" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.871462 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f"} err="failed to get container status \"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f\": rpc error: code = NotFound desc = could not find container \"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f\": container with ID starting with f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f not found: ID does not exist" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.871498 4962 scope.go:117] "RemoveContainer" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.872375 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b\": container with ID starting with c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b not found: ID does not exist" 
containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.872501 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b"} err="failed to get container status \"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b\": rpc error: code = NotFound desc = could not find container \"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b\": container with ID starting with c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b not found: ID does not exist" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.872551 4962 scope.go:117] "RemoveContainer" containerID="2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.895721 4962 scope.go:117] "RemoveContainer" containerID="fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.927782 4962 scope.go:117] "RemoveContainer" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.953841 4962 scope.go:117] "RemoveContainer" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.979493 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.984244 4962 scope.go:117] "RemoveContainer" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.984729 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515\": container with ID starting with e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515 not found: ID does not exist" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.984809 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515"} err="failed to get container status \"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515\": rpc error: code = NotFound desc = could not find container \"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515\": container with ID starting with e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515 not found: ID does not exist" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.984868 4962 scope.go:117] "RemoveContainer" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.985266 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2\": container with ID starting with a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2 not found: ID does not exist" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.985298 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2"} err="failed to get container status \"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2\": rpc error: code = NotFound desc = could not find container \"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2\": container with ID starting with a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2 not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.066970 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067044 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067126 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 
10:18:18.067913 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.075483 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data" (OuterVolumeSpecName: "config-data") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.076572 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.078967 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll" (OuterVolumeSpecName: "kube-api-access-65mll") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "kube-api-access-65mll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.087077 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_33d73a04-08b2-4944-861f-749a63c2565d/ovn-northd/0.log" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.087158 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.122441 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.136438 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.142645 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.151140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerDied","Data":"4fa1fbefe8085f86ec2949fb3171b5df9f6211664e4db89dbc2b776f71f19d88"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.151211 4962 scope.go:117] "RemoveContainer" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.151158 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163339 4962 generic.go:334] "Generic (PLEG): container finished" podID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" exitCode=0 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163450 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerDied","Data":"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163451 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerDied","Data":"527bc0b9350edbbd23edfe05a933e12b44f8d4ad0c70495feffaffb9052c4070"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.171797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.171881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.171989 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172034 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172125 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172167 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172523 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172542 4962 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172552 4962 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172562 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172574 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.173023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts" (OuterVolumeSpecName: "scripts") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.173163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config" (OuterVolumeSpecName: "config") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.173901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.180245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2" (OuterVolumeSpecName: "kube-api-access-r87b2") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "kube-api-access-r87b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.210186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerDied","Data":"2851b19111bcc172daacd941571725296e0313b2b3496256066714262e7d3b9a"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.210313 4962 scope.go:117] "RemoveContainer" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.210400 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.217006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219034 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" exitCode=0 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219064 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" exitCode=2 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219072 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" exitCode=0 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219144 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219154 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220857 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_33d73a04-08b2-4944-861f-749a63c2565d/ovn-northd/0.log" Feb 20 10:18:18 crc kubenswrapper[4962]: 
I0220 10:18:18.220889 4962 generic.go:334] "Generic (PLEG): container finished" podID="33d73a04-08b2-4944-861f-749a63c2565d" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" exitCode=139 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220928 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerDied","Data":"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerDied","Data":"5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.221003 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.243094 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.245284 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.261702 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.267818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.269316 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274140 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274173 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274182 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274190 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274200 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274211 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.277458 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:18:18 crc 
kubenswrapper[4962]: I0220 10:18:18.282921 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.293779 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.299814 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.304370 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.312824 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.337095 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.348282 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.354691 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.360312 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.375814 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.422352 4962 scope.go:117] "RemoveContainer" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.465836 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54\": container with ID starting with 2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54 not found: ID does not exist" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.465890 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54"} err="failed to get container status \"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54\": rpc error: code = NotFound desc = could not find 
container \"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54\": container with ID starting with 2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54 not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.465923 4962 scope.go:117] "RemoveContainer" containerID="490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.537049 4962 scope.go:117] "RemoveContainer" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.561874 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.567300 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.588260 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.589203 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.589275 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:20.589255284 +0000 UTC m=+1392.171727130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.649984 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.651052 4962 scope.go:117] "RemoveContainer" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.692989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.693455 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.693517 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts podName:812fea74-e4e5-4550-8a20-8fe04752a016 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:20.693493886 +0000 UTC m=+1392.275965732 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts") pod "root-account-create-update-6f6vb" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016") : configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.696580 4962 projected.go:194] Error preparing data for projected volume kube-api-access-zgjrp for pod openstack/keystone-125a-account-create-update-rtszm: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.697121 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:20.697063187 +0000 UTC m=+1392.279535033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zgjrp" (UniqueName: "kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.713306 4962 scope.go:117] "RemoveContainer" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.716842 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453\": container with ID starting with 0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453 not found: ID does not exist" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.716899 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453"} err="failed to get container status \"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453\": rpc error: code = NotFound desc = could not find container \"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453\": container with ID starting with 0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453 not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.716931 4962 scope.go:117] "RemoveContainer" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.717541 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe\": container with ID starting with 095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe not found: ID does not exist" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.717703 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe"} err="failed to get container status \"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe\": rpc error: code = NotFound desc = could not find container \"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe\": container with ID starting with 095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.794356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"812fea74-e4e5-4550-8a20-8fe04752a016\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.794688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"812fea74-e4e5-4550-8a20-8fe04752a016\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.795995 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "812fea74-e4e5-4550-8a20-8fe04752a016" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.801430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff" (OuterVolumeSpecName: "kube-api-access-5qvff") pod "812fea74-e4e5-4550-8a20-8fe04752a016" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016"). InnerVolumeSpecName "kube-api-access-5qvff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.801538 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896609 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896799 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896875 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc 
kubenswrapper[4962]: I0220 10:18:18.897109 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897236 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897250 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897346 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902767 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902804 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902839 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp" (OuterVolumeSpecName: "kube-api-access-2hckp") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "kube-api-access-2hckp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.903283 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info" (OuterVolumeSpecName: "pod-info") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.927266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data" (OuterVolumeSpecName: "config-data") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.946737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf" (OuterVolumeSpecName: "server-conf") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.989440 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999344 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999383 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999394 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999429 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999442 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999451 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999459 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999468 4962 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999479 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999489 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999500 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.021010 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.101641 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.155334 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" path="/var/lib/kubelet/pods/10c1a487-1a74-4994-9b39-f05cbe0fa5c7/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.156041 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" 
path="/var/lib/kubelet/pods/241dc417-3176-4051-ad4e-d98f4f66ddc2/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.163068 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d73a04-08b2-4944-861f-749a63c2565d" path="/var/lib/kubelet/pods/33d73a04-08b2-4944-861f-749a63c2565d/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.164550 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" path="/var/lib/kubelet/pods/4f4a409a-4230-42ca-bfcc-f014064cbc6c/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.166174 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" path="/var/lib/kubelet/pods/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.168433 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" path="/var/lib/kubelet/pods/b22a9e86-ccdf-4505-8116-21b0230943fc/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.169553 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" path="/var/lib/kubelet/pods/ba9a9d46-9ba9-428c-8864-a8db8bca2b57/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.171915 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" path="/var/lib/kubelet/pods/ca793428-98ed-4f82-aa57-31d6671d546c/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.172900 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" path="/var/lib/kubelet/pods/ce62af15-166f-4f74-a244-2de5147a4b2f/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.173700 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" 
path="/var/lib/kubelet/pods/cffca43e-3e19-4430-8fe2-ca7cfe6229b0/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.258749 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.259014 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.259372 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.259491 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 
10:18:19.259722 4962 generic.go:334] "Generic (PLEG): container finished" podID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" exitCode=0 Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259790 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259808 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerDied","Data":"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerDied","Data":"1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259870 4962 scope.go:117] "RemoveContainer" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.261954 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.265944 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 
10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.269170 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerDied","Data":"b35105a6f1f09300973fb51f5cc2ceed7e4acc42cd81be4a5215ef08b873fcd8"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.269193 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.269315 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.269370 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.285897 4962 generic.go:334] "Generic (PLEG): container finished" podID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerID="0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b" exitCode=0 Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.286101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerDied","Data":"0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.300344 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.306458 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.306522 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:27.306505375 +0000 UTC m=+1398.888977221 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.340207 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.348601 4962 scope.go:117] "RemoveContainer" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.350087 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.356197 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.361482 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.374251 4962 scope.go:117] "RemoveContainer" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.374678 4962 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718\": container with ID starting with 89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718 not found: ID does not exist" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.374709 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718"} err="failed to get container status \"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718\": rpc error: code = NotFound desc = could not find container \"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718\": container with ID starting with 89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718 not found: ID does not exist" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.374733 4962 scope.go:117] "RemoveContainer" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.375097 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857\": container with ID starting with 565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857 not found: ID does not exist" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.375119 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857"} err="failed to get container status \"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857\": rpc error: code = NotFound desc = could 
not find container \"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857\": container with ID starting with 565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857 not found: ID does not exist" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.375137 4962 scope.go:117] "RemoveContainer" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.385101 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.396938 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.510492 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.510978 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.626144 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715211 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715333 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715415 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zkm4\" 
(UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715469 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.716492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.717113 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.717471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.717774 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.723689 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4" (OuterVolumeSpecName: "kube-api-access-4zkm4") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "kube-api-access-4zkm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.733379 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.759160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.788302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817487 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817522 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817533 4962 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817543 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") on node \"crc\" 
DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817572 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817582 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817618 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817629 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.833438 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.919379 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.973901 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.057547 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124403 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124929 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125009 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125043 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125068 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125188 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125306 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125354 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125418 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: 
"fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125513 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125557 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125956 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125968 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.126258 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.126649 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.129010 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.130133 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info" (OuterVolumeSpecName: "pod-info") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.131436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.131723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts" (OuterVolumeSpecName: "scripts") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.133307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.134279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.146474 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk" (OuterVolumeSpecName: "kube-api-access-rcvhk") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "kube-api-access-rcvhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.146615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh" (OuterVolumeSpecName: "kube-api-access-mvmhh") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "kube-api-access-mvmhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.205209 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.211496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf" (OuterVolumeSpecName: "server-conf") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.212023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data" (OuterVolumeSpecName: "config-data") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227827 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227861 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227873 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227886 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 
10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227894 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227903 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227912 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227920 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227939 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227948 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227956 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227964 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227972 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.265404 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.314265 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.335798 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data" (OuterVolumeSpecName: "config-data") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.340054 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: W0220 10:18:20.340284 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fae69c76-754d-4125-a405-23a3938e90a9/volumes/kubernetes.io~secret/config-data Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.340348 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data" (OuterVolumeSpecName: "config-data") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.341625 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.341647 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.341663 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.347649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerDied","Data":"9629cf6fabd95f146380c31c7bc910c7de73918acc62bb7e7fbe72c4774cfa18"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.347724 4962 scope.go:117] "RemoveContainer" containerID="0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.347930 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.369235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.403310 4962 generic.go:334] "Generic (PLEG): container finished" podID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.403727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerDied","Data":"5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.410808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423057 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423122 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423468 4962 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.443918 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.443949 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.447273 4962 generic.go:334] "Generic (PLEG): container finished" podID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.447417 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerDied","Data":"24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.450248 4962 generic.go:334] "Generic (PLEG): container finished" podID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerID="57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.450314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerDied","Data":"57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452580 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" 
exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452631 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerDied","Data":"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerDied","Data":"b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452732 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.506434 4962 scope.go:117] "RemoveContainer" containerID="5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.547543 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.571876 4962 scope.go:117] "RemoveContainer" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.590289 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.605185 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.610474 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.615030 4962 scope.go:117] "RemoveContainer" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.634721 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.643277 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647206 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"dcd02115-2eb9-4090-8225-108c3a8cad20\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647703 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"dcd02115-2eb9-4090-8225-108c3a8cad20\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvxk6\" (UniqueName: 
\"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"dcd02115-2eb9-4090-8225-108c3a8cad20\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.655876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6" (OuterVolumeSpecName: "kube-api-access-qvxk6") pod "dcd02115-2eb9-4090-8225-108c3a8cad20" (UID: "dcd02115-2eb9-4090-8225-108c3a8cad20"). InnerVolumeSpecName "kube-api-access-qvxk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.663191 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.665907 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.667910 4962 scope.go:117] "RemoveContainer" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.685821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data" (OuterVolumeSpecName: "config-data") pod "dcd02115-2eb9-4090-8225-108c3a8cad20" (UID: "dcd02115-2eb9-4090-8225-108c3a8cad20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.686349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcd02115-2eb9-4090-8225-108c3a8cad20" (UID: "dcd02115-2eb9-4090-8225-108c3a8cad20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.712041 4962 scope.go:117] "RemoveContainer" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.737365 4962 scope.go:117] "RemoveContainer" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.737945 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5\": container with ID starting with ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5 not found: ID does not exist" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738019 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5"} err="failed to get container status \"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5\": rpc error: code = NotFound desc = could not find container \"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5\": container with ID starting with ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5 not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738066 4962 scope.go:117] "RemoveContainer" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.738477 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e\": container with ID starting with 
6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e not found: ID does not exist" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738523 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e"} err="failed to get container status \"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e\": rpc error: code = NotFound desc = could not find container \"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e\": container with ID starting with 6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738558 4962 scope.go:117] "RemoveContainer" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.738961 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917\": container with ID starting with ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917 not found: ID does not exist" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738990 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917"} err="failed to get container status \"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917\": rpc error: code = NotFound desc = could not find container \"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917\": container with ID starting with ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917 not found: ID does not 
exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.739003 4962 scope.go:117] "RemoveContainer" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.739389 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a\": container with ID starting with cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a not found: ID does not exist" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.739411 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a"} err="failed to get container status \"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a\": rpc error: code = NotFound desc = could not find container \"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a\": container with ID starting with cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.739428 4962 scope.go:117] "RemoveContainer" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"815f0ef8-a30a-4467-bb56-ff8499a4be44\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750147 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"815f0ef8-a30a-4467-bb56-ff8499a4be44\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750215 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"815f0ef8-a30a-4467-bb56-ff8499a4be44\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 
10:18:20.750348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750377 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750540 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750610 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750906 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750924 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750934 4962 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756113 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9" (OuterVolumeSpecName: "kube-api-access-gzvw9") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "kube-api-access-gzvw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts" (OuterVolumeSpecName: "scripts") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h" (OuterVolumeSpecName: "kube-api-access-hql7h") pod "815f0ef8-a30a-4467-bb56-ff8499a4be44" (UID: "815f0ef8-a30a-4467-bb56-ff8499a4be44"). InnerVolumeSpecName "kube-api-access-hql7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756234 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.765209 4962 scope.go:117] "RemoveContainer" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.772450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "815f0ef8-a30a-4467-bb56-ff8499a4be44" (UID: "815f0ef8-a30a-4467-bb56-ff8499a4be44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.773072 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data" (OuterVolumeSpecName: "config-data") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.778349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data" (OuterVolumeSpecName: "config-data") pod "815f0ef8-a30a-4467-bb56-ff8499a4be44" (UID: "815f0ef8-a30a-4467-bb56-ff8499a4be44"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.783907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.787925 4962 scope.go:117] "RemoveContainer" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.788502 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308\": container with ID starting with f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308 not found: ID does not exist" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.788547 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308"} err="failed to get container status \"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308\": rpc error: code = NotFound desc = could not find container \"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308\": container with ID starting with f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308 not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.788606 4962 scope.go:117] "RemoveContainer" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.789050 4962 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e\": container with ID starting with 1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e not found: ID does not exist" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.789079 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e"} err="failed to get container status \"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e\": rpc error: code = NotFound desc = could not find container \"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e\": container with ID starting with 1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.795820 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.802497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853128 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853245 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853298 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853310 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853320 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853328 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853339 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 
10:18:20.853349 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853360 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853368 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853378 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.161445 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0991ff2f-16e5-4891-a38d-8cb9e4b016ec" path="/var/lib/kubelet/pods/0991ff2f-16e5-4891-a38d-8cb9e4b016ec/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.162544 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" path="/var/lib/kubelet/pods/2a8d652d-aea8-4a83-b33e-0d2522af0be8/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.164285 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" path="/var/lib/kubelet/pods/56a77dd3-ef10-46a6-a00d-ab38af0d4338/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.166917 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" 
path="/var/lib/kubelet/pods/6e766bfd-869d-43ca-bf11-cf4ec9fa253a/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.168495 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" path="/var/lib/kubelet/pods/812fea74-e4e5-4550-8a20-8fe04752a016/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.169964 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae69c76-754d-4125-a405-23a3938e90a9" path="/var/lib/kubelet/pods/fae69c76-754d-4125-a405-23a3938e90a9/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.462202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerDied","Data":"cf5b12fd788026ff0304070e27b4ebd505b31b0d0e831a4ccd6e51bd8bb0b383"} Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.462469 4962 scope.go:117] "RemoveContainer" containerID="57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.462222 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.474528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerDied","Data":"5925bb54309b7a0a7036656c54ac3f8deef63680ce4f7825beb5965502489453"} Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.474627 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.478989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerDied","Data":"a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde"} Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.479082 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.495126 4962 scope.go:117] "RemoveContainer" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.508391 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.524379 4962 scope.go:117] "RemoveContainer" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.529790 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.554666 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.569653 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.576668 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.579084 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:18:23 crc kubenswrapper[4962]: I0220 10:18:23.178902 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" path="/var/lib/kubelet/pods/815f0ef8-a30a-4467-bb56-ff8499a4be44/volumes" Feb 20 10:18:23 crc kubenswrapper[4962]: I0220 10:18:23.182938 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" path="/var/lib/kubelet/pods/d203fc44-5252-4dd2-98ae-66f9c139b5f5/volumes" Feb 20 10:18:23 crc kubenswrapper[4962]: I0220 10:18:23.184437 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" path="/var/lib/kubelet/pods/dcd02115-2eb9-4090-8225-108c3a8cad20/volumes" Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.257916 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.260081 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.260272 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:24 crc 
kubenswrapper[4962]: E0220 10:18:24.262066 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.262153 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.266159 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.268085 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.268138 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.258277 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.259572 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.260069 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.260173 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 
10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.262424 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.268881 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.276743 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.276818 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:31 crc kubenswrapper[4962]: I0220 10:18:31.599705 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerID="45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3" exitCode=0 Feb 20 10:18:31 crc kubenswrapper[4962]: I0220 10:18:31.599822 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" 
event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerDied","Data":"45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3"} Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.162580 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301360 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301390 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301474 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.316771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.317359 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn" (OuterVolumeSpecName: "kube-api-access-6kdsn") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "kube-api-access-6kdsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.342173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config" (OuterVolumeSpecName: "config") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.342468 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.354314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.375747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.385819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403432 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403462 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403472 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403481 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403491 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403500 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403508 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.617526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerDied","Data":"4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411"} Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.617652 4962 scope.go:117] "RemoveContainer" containerID="731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.617668 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.713082 4962 scope.go:117] "RemoveContainer" containerID="45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.717119 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.727100 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:18:33 crc kubenswrapper[4962]: I0220 10:18:33.169681 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" path="/var/lib/kubelet/pods/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3/volumes" Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.258010 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.259299 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.260120 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.260202 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.260431 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.262427 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.264713 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.264768 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.257997 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.259091 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.260004 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.260054 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.261094 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.263495 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.265840 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.265959 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:40 crc kubenswrapper[4962]: I0220 10:18:40.739965 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r7g9h_8e8425d5-32be-4726-915a-3de5c70f0f62/ovs-vswitchd/0.log" Feb 20 10:18:40 crc kubenswrapper[4962]: I0220 10:18:40.742322 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" exitCode=137 Feb 20 10:18:40 crc kubenswrapper[4962]: I0220 10:18:40.742370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38"} Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.125832 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r7g9h_8e8425d5-32be-4726-915a-3de5c70f0f62/ovs-vswitchd/0.log" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.127274 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274391 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274486 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274459 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log" (OuterVolumeSpecName: "var-log") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274586 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274681 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run" (OuterVolumeSpecName: "var-run") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274812 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib" (OuterVolumeSpecName: "var-lib") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275003 4962 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275021 4962 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275030 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275041 4962 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.276428 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts" (OuterVolumeSpecName: "scripts") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.283498 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc" (OuterVolumeSpecName: "kube-api-access-z44jc") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "kube-api-access-z44jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.383046 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.383106 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.765244 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r7g9h_8e8425d5-32be-4726-915a-3de5c70f0f62/ovs-vswitchd/0.log" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.767094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"b4d03ac8272f687d64246b8c3c40efcac57552a3657ef2ee1db4c3625f47035c"} Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.767185 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.767193 4962 scope.go:117] "RemoveContainer" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.778957 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141" exitCode=137 Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.779024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141"} Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.800753 4962 scope.go:117] "RemoveContainer" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.833104 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.842222 4962 scope.go:117] "RemoveContainer" containerID="b6626b3616a8427737e8c790adcc57ad3f4d0385df8b472ffc49fd4bd021b003" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.849582 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.988853 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.092903 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.092984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093060 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093224 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.094517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock" (OuterVolumeSpecName: "lock") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.095117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache" (OuterVolumeSpecName: "cache") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.100832 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.101446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn" (OuterVolumeSpecName: "kube-api-access-57kkn") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "kube-api-access-57kkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.101898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.195433 4962 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196530 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196568 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196590 4962 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196666 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.225223 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node 
"crc" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.299077 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.457301 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.502749 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.798247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5"} Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.798338 4962 scope.go:117] "RemoveContainer" containerID="63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.798415 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.820117 4962 scope.go:117] "RemoveContainer" containerID="3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.843201 4962 scope.go:117] "RemoveContainer" containerID="6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.856659 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.861461 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.943822 4962 scope.go:117] "RemoveContainer" containerID="3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.967861 4962 scope.go:117] "RemoveContainer" containerID="3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.991578 4962 scope.go:117] "RemoveContainer" containerID="05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.016913 4962 scope.go:117] "RemoveContainer" containerID="b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.048341 4962 scope.go:117] "RemoveContainer" containerID="87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.075587 4962 scope.go:117] "RemoveContainer" containerID="4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.101811 4962 scope.go:117] "RemoveContainer" containerID="8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8" Feb 20 10:18:43 crc 
kubenswrapper[4962]: I0220 10:18:43.144628 4962 scope.go:117] "RemoveContainer" containerID="6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.154236 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" path="/var/lib/kubelet/pods/8e8425d5-32be-4726-915a-3de5c70f0f62/volumes" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.155658 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" path="/var/lib/kubelet/pods/f4fb3b99-0e02-4c5c-9704-884ea3f0605d/volumes" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.172294 4962 scope.go:117] "RemoveContainer" containerID="1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.203907 4962 scope.go:117] "RemoveContainer" containerID="5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.233269 4962 scope.go:117] "RemoveContainer" containerID="066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.258459 4962 scope.go:117] "RemoveContainer" containerID="138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1" Feb 20 10:19:11 crc kubenswrapper[4962]: I0220 10:19:11.508894 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:19:11 crc kubenswrapper[4962]: I0220 10:19:11.509432 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.522480 4962 scope.go:117] "RemoveContainer" containerID="109a3b4f30138b426060ee3960875f54b8e50460794fa326f4252e9233232cac" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.558686 4962 scope.go:117] "RemoveContainer" containerID="8e1e57cd49c915d1862d936053074b6280af762ac9dd3bf4c1c80c561fca009f" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.600957 4962 scope.go:117] "RemoveContainer" containerID="a637cdafdb841809ec5f95151c668e1d8c78d29aabd8a60383f137a82dcb2009" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.644545 4962 scope.go:117] "RemoveContainer" containerID="1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.690382 4962 scope.go:117] "RemoveContainer" containerID="1dbe5f8319feef22f1ef43626823510cfe8e71d6a8d49cafca70087ce33b1b60" Feb 20 10:19:41 crc kubenswrapper[4962]: I0220 10:19:41.508451 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:19:41 crc kubenswrapper[4962]: I0220 10:19:41.509244 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.508418 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.509132 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.509208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.510376 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.510476 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" gracePeriod=600 Feb 20 10:20:11 crc kubenswrapper[4962]: E0220 10:20:11.663686 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.903240 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" exitCode=0 Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.903315 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"} Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.903370 4962 scope.go:117] "RemoveContainer" containerID="90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.904723 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:11 crc kubenswrapper[4962]: E0220 10:20:11.907942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.570177 4962 scope.go:117] "RemoveContainer" containerID="06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.641560 4962 scope.go:117] "RemoveContainer" containerID="03ab33469ea979640d7188e1c0dc68dd1548a99d601929f7b4e160bee72396f3" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.677402 4962 scope.go:117] 
"RemoveContainer" containerID="aa045e6922dfe4d5b86be77916d3a6f56d92ad5d8849a14be83a3fc1d37883cc" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.704037 4962 scope.go:117] "RemoveContainer" containerID="1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.776706 4962 scope.go:117] "RemoveContainer" containerID="ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.836113 4962 scope.go:117] "RemoveContainer" containerID="0be6bfc0db94e6c57e1c0a4856d3600b1ea4d12d42a32685b52156cacc1224a0" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.870863 4962 scope.go:117] "RemoveContainer" containerID="53831e942d8d69707dcfe40655e43c5762a4d492f07b1c79ed7f413953ec5f61" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.905665 4962 scope.go:117] "RemoveContainer" containerID="1933a4410cc57079acebbf3cca845c0c1a3c75df94daefc5b4a3cc61d913faab" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.935382 4962 scope.go:117] "RemoveContainer" containerID="5a9782006ca96cee05b8576db8cf67f09117b6ff20027f1e9a751d12df45c5f2" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.960568 4962 scope.go:117] "RemoveContainer" containerID="a4453e5e140badbab6aa97996c8ab339f8ab22881b41594395bdb84a3005b466" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.007580 4962 scope.go:117] "RemoveContainer" containerID="1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.043575 4962 scope.go:117] "RemoveContainer" containerID="c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.095869 4962 scope.go:117] "RemoveContainer" containerID="d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.128268 4962 scope.go:117] "RemoveContainer" 
containerID="7fd77ce11ed465ec4237e46a1c362e414960d4a8e3a2e89e44d3a98f1d109ea9" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.153182 4962 scope.go:117] "RemoveContainer" containerID="2c027b22cf0ba460d458ecf5143a855bd6cabc995b34bcff27678d1a95ac71b9" Feb 20 10:20:24 crc kubenswrapper[4962]: I0220 10:20:24.138856 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:24 crc kubenswrapper[4962]: E0220 10:20:24.139822 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:39 crc kubenswrapper[4962]: I0220 10:20:39.146825 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:39 crc kubenswrapper[4962]: E0220 10:20:39.150266 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:52 crc kubenswrapper[4962]: I0220 10:20:52.139007 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:52 crc kubenswrapper[4962]: E0220 10:20:52.141656 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:07 crc kubenswrapper[4962]: I0220 10:21:07.139500 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:07 crc kubenswrapper[4962]: E0220 10:21:07.140720 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.498227 4962 scope.go:117] "RemoveContainer" containerID="6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.551317 4962 scope.go:117] "RemoveContainer" containerID="a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.592562 4962 scope.go:117] "RemoveContainer" containerID="c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.626742 4962 scope.go:117] "RemoveContainer" containerID="f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.697725 4962 scope.go:117] "RemoveContainer" containerID="bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.725902 4962 scope.go:117] "RemoveContainer" 
containerID="ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952" Feb 20 10:21:19 crc kubenswrapper[4962]: I0220 10:21:19.146865 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:19 crc kubenswrapper[4962]: E0220 10:21:19.149495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:30 crc kubenswrapper[4962]: I0220 10:21:30.139431 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:30 crc kubenswrapper[4962]: E0220 10:21:30.140641 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:43 crc kubenswrapper[4962]: I0220 10:21:43.139535 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:43 crc kubenswrapper[4962]: E0220 10:21:43.142434 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:58 crc kubenswrapper[4962]: I0220 10:21:58.139170 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:58 crc kubenswrapper[4962]: E0220 10:21:58.139783 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:09 crc kubenswrapper[4962]: I0220 10:22:09.147309 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:09 crc kubenswrapper[4962]: E0220 10:22:09.148337 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.873505 4962 scope.go:117] "RemoveContainer" containerID="bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.909469 4962 scope.go:117] "RemoveContainer" containerID="d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.938490 4962 scope.go:117] "RemoveContainer" containerID="e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2" Feb 
20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.975867 4962 scope.go:117] "RemoveContainer" containerID="e1788ed30c723d96dcb6e0f9484b28a97145a65cc9e3bff73edd5bbbf2ff0b13" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.032705 4962 scope.go:117] "RemoveContainer" containerID="3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.089532 4962 scope.go:117] "RemoveContainer" containerID="b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.114317 4962 scope.go:117] "RemoveContainer" containerID="e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.136861 4962 scope.go:117] "RemoveContainer" containerID="793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.179166 4962 scope.go:117] "RemoveContainer" containerID="0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.203077 4962 scope.go:117] "RemoveContainer" containerID="7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.229523 4962 scope.go:117] "RemoveContainer" containerID="28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7" Feb 20 10:22:24 crc kubenswrapper[4962]: I0220 10:22:24.138910 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:24 crc kubenswrapper[4962]: E0220 10:22:24.139773 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:39 crc kubenswrapper[4962]: I0220 10:22:39.146311 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:39 crc kubenswrapper[4962]: E0220 10:22:39.147417 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:51 crc kubenswrapper[4962]: I0220 10:22:51.139059 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:51 crc kubenswrapper[4962]: E0220 10:22:51.140119 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:23:05 crc kubenswrapper[4962]: I0220 10:23:05.139138 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:23:05 crc kubenswrapper[4962]: E0220 10:23:05.140313 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:23:15 crc kubenswrapper[4962]: I0220 10:23:15.403870 4962 scope.go:117] "RemoveContainer" containerID="1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe"
Feb 20 10:23:15 crc kubenswrapper[4962]: I0220 10:23:15.464453 4962 scope.go:117] "RemoveContainer" containerID="c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f"
Feb 20 10:23:16 crc kubenswrapper[4962]: I0220 10:23:16.139356 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:23:16 crc kubenswrapper[4962]: E0220 10:23:16.139781 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:23:27 crc kubenswrapper[4962]: I0220 10:23:27.139060 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:23:27 crc kubenswrapper[4962]: E0220 10:23:27.139709 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:23:41 crc kubenswrapper[4962]: I0220 10:23:41.140931 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:23:41 crc kubenswrapper[4962]: E0220 10:23:41.142617 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:23:52 crc kubenswrapper[4962]: I0220 10:23:52.139075 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:23:52 crc kubenswrapper[4962]: E0220 10:23:52.140216 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:24:03 crc kubenswrapper[4962]: I0220 10:24:03.139810 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:24:03 crc kubenswrapper[4962]: E0220 10:24:03.140917 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.139522 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:24:15 crc kubenswrapper[4962]: E0220 10:24:15.140692 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.557744 4962 scope.go:117] "RemoveContainer" containerID="d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3"
Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.586825 4962 scope.go:117] "RemoveContainer" containerID="42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9"
Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.621416 4962 scope.go:117] "RemoveContainer" containerID="f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577"
Feb 20 10:24:26 crc kubenswrapper[4962]: I0220 10:24:26.139541 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:24:26 crc kubenswrapper[4962]: E0220 10:24:26.140664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:24:38 crc kubenswrapper[4962]: I0220 10:24:38.139209 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:24:38 crc kubenswrapper[4962]: E0220 10:24:38.139962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:24:50 crc kubenswrapper[4962]: I0220 10:24:50.140172 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:24:50 crc kubenswrapper[4962]: E0220 10:24:50.141502 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:25:04 crc kubenswrapper[4962]: I0220 10:25:04.139937 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:25:04 crc kubenswrapper[4962]: E0220 10:25:04.141113 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 10:25:15 crc kubenswrapper[4962]: I0220 10:25:15.138673 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"
Feb 20 10:25:16 crc kubenswrapper[4962]: I0220 10:25:16.079333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde"}
Feb 20 10:27:41 crc kubenswrapper[4962]: I0220 10:27:41.508122 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:27:41 crc kubenswrapper[4962]: I0220 10:27:41.509153 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:28:11 crc kubenswrapper[4962]: I0220 10:28:11.508042 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 10:28:11 crc kubenswrapper[4962]: I0220 10:28:11.508991 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.065564 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-twkms"]
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066679 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066694 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066706 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="setup-container"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066712 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="setup-container"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066720 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066731 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066746 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066755 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066770 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066776 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066797 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066805 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066827 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="setup-container"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066834 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="setup-container"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066841 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066866 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066873 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066892 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="mysql-bootstrap"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066898 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="mysql-bootstrap"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066914 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066920 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066939 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066945 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066956 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066963 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066970 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066978 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066991 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066997 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067020 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067025 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067033 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067039 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067049 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067056 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067069 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067075 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067087 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067093 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067114 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067127 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067135 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067143 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067150 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067169 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067192 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067198 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067209 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067215 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067228 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067234 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067245 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067251 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067264 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067270 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067279 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067287 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067294 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067299 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067310 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server-init"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067316 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server-init"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067331 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067337 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067345 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067351 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067368 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067374 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067382 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067388 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067406 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067413 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067424 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067430 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067441 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067447 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067464 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067476 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067495 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067501 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067516 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067522 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067530 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067536 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067547 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067553 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater"
Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067564 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067570 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068227 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068243 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068256 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068275 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068286 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068302 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068310 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068325 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068333 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068349 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068361 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068373 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068381 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068392 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068400 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068416 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068426 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068437 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068448 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068464 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068479 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068493 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068499 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068514 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068527 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068541 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068553 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068564 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068575 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068598 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068620 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068631 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068642 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068655 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068666 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068673 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068682 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068695 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068710 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068718 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068730 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068743 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.072342 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.091549 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"]
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.170211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.170312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.170441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272174 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272984 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.299736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms"
Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.412219 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.864336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.769766 4962 generic.go:334] "Generic (PLEG): container finished" podID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" exitCode=0 Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.769910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87"} Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.770447 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerStarted","Data":"5fe37c57c2b2b7862de3baa3e03886ddbe7805805cfcf43d366e424522152507"} Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.776493 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:28:21 crc kubenswrapper[4962]: I0220 10:28:21.783985 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerStarted","Data":"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e"} Feb 20 10:28:22 crc kubenswrapper[4962]: I0220 10:28:22.797892 4962 generic.go:334] "Generic (PLEG): container finished" podID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" exitCode=0 Feb 20 10:28:22 crc kubenswrapper[4962]: I0220 10:28:22.797970 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e"} Feb 20 10:28:23 crc kubenswrapper[4962]: I0220 10:28:23.814117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerStarted","Data":"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800"} Feb 20 10:28:23 crc kubenswrapper[4962]: I0220 10:28:23.844800 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-twkms" podStartSLOduration=2.340427011 podStartE2EDuration="4.844775004s" podCreationTimestamp="2026-02-20 10:28:19 +0000 UTC" firstStartedPulling="2026-02-20 10:28:20.776195936 +0000 UTC m=+1992.358667792" lastFinishedPulling="2026-02-20 10:28:23.280543909 +0000 UTC m=+1994.863015785" observedRunningTime="2026-02-20 10:28:23.836912299 +0000 UTC m=+1995.419384185" watchObservedRunningTime="2026-02-20 10:28:23.844775004 +0000 UTC m=+1995.427246880" Feb 20 10:28:29 crc kubenswrapper[4962]: I0220 10:28:29.412397 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:29 crc kubenswrapper[4962]: I0220 10:28:29.412951 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:30 crc kubenswrapper[4962]: I0220 10:28:30.479680 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-twkms" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" probeResult="failure" output=< Feb 20 10:28:30 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:28:30 crc kubenswrapper[4962]: > Feb 20 10:28:39 crc kubenswrapper[4962]: I0220 
10:28:39.498822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:39 crc kubenswrapper[4962]: I0220 10:28:39.584206 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:39 crc kubenswrapper[4962]: I0220 10:28:39.760941 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:40 crc kubenswrapper[4962]: I0220 10:28:40.986751 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-twkms" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" containerID="cri-o://5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" gracePeriod=2 Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.484746 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.508383 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.508459 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.508514 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.509270 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.509342 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde" gracePeriod=600 Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.556423 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.556525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.556587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " Feb 20 10:28:41 crc 
kubenswrapper[4962]: I0220 10:28:41.557487 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities" (OuterVolumeSpecName: "utilities") pod "319aafc1-a34d-45d0-9b00-67b3c80f3f04" (UID: "319aafc1-a34d-45d0-9b00-67b3c80f3f04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.564112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd" (OuterVolumeSpecName: "kube-api-access-dpzrd") pod "319aafc1-a34d-45d0-9b00-67b3c80f3f04" (UID: "319aafc1-a34d-45d0-9b00-67b3c80f3f04"). InnerVolumeSpecName "kube-api-access-dpzrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.658794 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.658830 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") on node \"crc\" DevicePath \"\"" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.695034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "319aafc1-a34d-45d0-9b00-67b3c80f3f04" (UID: "319aafc1-a34d-45d0-9b00-67b3c80f3f04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.759949 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001521 4962 generic.go:334] "Generic (PLEG): container finished" podID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" exitCode=0 Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001644 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"5fe37c57c2b2b7862de3baa3e03886ddbe7805805cfcf43d366e424522152507"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001817 4962 scope.go:117] "RemoveContainer" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.015225 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde" exitCode=0 Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.015324 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.015389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.050034 4962 scope.go:117] "RemoveContainer" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.073669 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.081094 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.097925 4962 scope.go:117] "RemoveContainer" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.131891 4962 scope.go:117] "RemoveContainer" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" Feb 20 10:28:42 crc kubenswrapper[4962]: E0220 10:28:42.132526 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800\": container with ID starting with 5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800 not found: ID does not exist" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.132569 4962 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800"} err="failed to get container status \"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800\": rpc error: code = NotFound desc = could not find container \"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800\": container with ID starting with 5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800 not found: ID does not exist" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.132617 4962 scope.go:117] "RemoveContainer" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" Feb 20 10:28:42 crc kubenswrapper[4962]: E0220 10:28:42.133120 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e\": container with ID starting with 100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e not found: ID does not exist" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133181 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e"} err="failed to get container status \"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e\": rpc error: code = NotFound desc = could not find container \"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e\": container with ID starting with 100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e not found: ID does not exist" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133225 4962 scope.go:117] "RemoveContainer" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" Feb 20 10:28:42 crc kubenswrapper[4962]: E0220 10:28:42.133618 4962 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87\": container with ID starting with d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87 not found: ID does not exist" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133655 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87"} err="failed to get container status \"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87\": rpc error: code = NotFound desc = could not find container \"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87\": container with ID starting with d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87 not found: ID does not exist" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133682 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:28:43 crc kubenswrapper[4962]: I0220 10:28:43.159200 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" path="/var/lib/kubelet/pods/319aafc1-a34d-45d0-9b00-67b3c80f3f04/volumes" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.791863 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:28:57 crc kubenswrapper[4962]: E0220 10:28:57.792944 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-utilities" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.792967 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-utilities" Feb 20 10:28:57 crc 
kubenswrapper[4962]: E0220 10:28:57.793010 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-content" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.793023 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-content" Feb 20 10:28:57 crc kubenswrapper[4962]: E0220 10:28:57.793058 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.793071 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.793320 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.794957 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.810110 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.816978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.817170 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.817292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921245 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921946 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.922556 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.960552 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:58 crc kubenswrapper[4962]: I0220 10:28:58.142073 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:58 crc kubenswrapper[4962]: I0220 10:28:58.672233 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.221093 4962 generic.go:334] "Generic (PLEG): container finished" podID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerID="ba830112dfbb46386f795bf9c8f766dbff593db0dd17e9fef4db3b3bebe42597" exitCode=0 Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.221160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"ba830112dfbb46386f795bf9c8f766dbff593db0dd17e9fef4db3b3bebe42597"} Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.221556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerStarted","Data":"de425b43216f19572591d858be95124b141a268bfc57571deba58215fe7f8d3f"} Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.783804 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.786479 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.823397 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.955300 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.955357 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.955528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.056663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.056837 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.056882 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.057195 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.057350 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.082778 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.115704 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.239094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerStarted","Data":"8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5"} Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.401285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.776428 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.778970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.791471 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.970811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.970951 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.971008 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.072491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.072563 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.072709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.073133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.073203 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.094864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.100083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.256679 4962 generic.go:334] "Generic (PLEG): container finished" podID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" exitCode=0 Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.256788 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9"} Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.256823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerStarted","Data":"19dcaa5f1fe9bc0e1a14e304278ba9fabd067499af81e2b79740a27bccc7d78d"} Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.260964 4962 generic.go:334] "Generic (PLEG): container finished" podID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" 
containerID="8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5" exitCode=0 Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.261031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5"} Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.368693 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.271020 4962 generic.go:334] "Generic (PLEG): container finished" podID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" exitCode=0 Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.271254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.276507 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerStarted","Data":"1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.278853 4962 generic.go:334] "Generic (PLEG): container finished" podID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" exitCode=0 Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.278922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" 
event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.278967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerStarted","Data":"cb0f58c8e15de5b78552c1a1def14285bd1745ff1b57dce0feda8de2b6f2a54a"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.375358 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k5bgh" podStartSLOduration=2.941778397 podStartE2EDuration="5.375326751s" podCreationTimestamp="2026-02-20 10:28:57 +0000 UTC" firstStartedPulling="2026-02-20 10:28:59.223112417 +0000 UTC m=+2030.805584273" lastFinishedPulling="2026-02-20 10:29:01.656660741 +0000 UTC m=+2033.239132627" observedRunningTime="2026-02-20 10:29:02.368209209 +0000 UTC m=+2033.950681105" watchObservedRunningTime="2026-02-20 10:29:02.375326751 +0000 UTC m=+2033.957798637" Feb 20 10:29:03 crc kubenswrapper[4962]: I0220 10:29:03.290894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerStarted","Data":"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187"} Feb 20 10:29:03 crc kubenswrapper[4962]: I0220 10:29:03.293586 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerStarted","Data":"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e"} Feb 20 10:29:03 crc kubenswrapper[4962]: I0220 10:29:03.307866 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9bxk" podStartSLOduration=2.870389904 
podStartE2EDuration="4.307844367s" podCreationTimestamp="2026-02-20 10:28:59 +0000 UTC" firstStartedPulling="2026-02-20 10:29:01.259473105 +0000 UTC m=+2032.841944951" lastFinishedPulling="2026-02-20 10:29:02.696927538 +0000 UTC m=+2034.279399414" observedRunningTime="2026-02-20 10:29:03.304294437 +0000 UTC m=+2034.886766293" watchObservedRunningTime="2026-02-20 10:29:03.307844367 +0000 UTC m=+2034.890316223" Feb 20 10:29:04 crc kubenswrapper[4962]: I0220 10:29:04.305654 4962 generic.go:334] "Generic (PLEG): container finished" podID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" exitCode=0 Feb 20 10:29:04 crc kubenswrapper[4962]: I0220 10:29:04.305768 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e"} Feb 20 10:29:05 crc kubenswrapper[4962]: I0220 10:29:05.316826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerStarted","Data":"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec"} Feb 20 10:29:05 crc kubenswrapper[4962]: I0220 10:29:05.351718 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-72m89" podStartSLOduration=2.914972255 podStartE2EDuration="5.351694609s" podCreationTimestamp="2026-02-20 10:29:00 +0000 UTC" firstStartedPulling="2026-02-20 10:29:02.281989309 +0000 UTC m=+2033.864461195" lastFinishedPulling="2026-02-20 10:29:04.718711703 +0000 UTC m=+2036.301183549" observedRunningTime="2026-02-20 10:29:05.34690308 +0000 UTC m=+2036.929374936" watchObservedRunningTime="2026-02-20 10:29:05.351694609 +0000 UTC m=+2036.934166465" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 
10:29:08.143074 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.143559 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.223710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.421814 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.116873 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.117322 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.197053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.438093 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.100660 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.100739 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.174999 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.416065 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:12 crc kubenswrapper[4962]: I0220 10:29:12.973913 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:29:12 crc kubenswrapper[4962]: I0220 10:29:12.974271 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k5bgh" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" containerID="cri-o://1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1" gracePeriod=2 Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.398520 4962 generic.go:334] "Generic (PLEG): container finished" podID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerID="1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1" exitCode=0 Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.398573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1"} Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.523756 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.578931 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.579391 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-72m89" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" containerID="cri-o://ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" gracePeriod=2 Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.717168 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.717435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.717487 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.718154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities" (OuterVolumeSpecName: "utilities") pod "fe8d972b-b855-4f15-b7ad-10530b2b31ed" (UID: 
"fe8d972b-b855-4f15-b7ad-10530b2b31ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.726206 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p" (OuterVolumeSpecName: "kube-api-access-9279p") pod "fe8d972b-b855-4f15-b7ad-10530b2b31ed" (UID: "fe8d972b-b855-4f15-b7ad-10530b2b31ed"). InnerVolumeSpecName "kube-api-access-9279p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.773772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe8d972b-b855-4f15-b7ad-10530b2b31ed" (UID: "fe8d972b-b855-4f15-b7ad-10530b2b31ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.819349 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.819379 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.819388 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.050979 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.224424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.224523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.224920 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.225791 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities" (OuterVolumeSpecName: "utilities") pod "8cbf1964-98e2-467d-ba01-7724a5f1a71c" (UID: "8cbf1964-98e2-467d-ba01-7724a5f1a71c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.231106 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6" (OuterVolumeSpecName: "kube-api-access-nk7z6") pod "8cbf1964-98e2-467d-ba01-7724a5f1a71c" (UID: "8cbf1964-98e2-467d-ba01-7724a5f1a71c"). InnerVolumeSpecName "kube-api-access-nk7z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.273646 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cbf1964-98e2-467d-ba01-7724a5f1a71c" (UID: "8cbf1964-98e2-467d-ba01-7724a5f1a71c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.327181 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.327225 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.327246 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.412721 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.412723 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"de425b43216f19572591d858be95124b141a268bfc57571deba58215fe7f8d3f"} Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.412951 4962 scope.go:117] "RemoveContainer" containerID="1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.417859 4962 generic.go:334] "Generic (PLEG): container finished" podID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" exitCode=0 Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.417902 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec"} Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.417958 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"cb0f58c8e15de5b78552c1a1def14285bd1745ff1b57dce0feda8de2b6f2a54a"} Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.418244 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.451989 4962 scope.go:117] "RemoveContainer" containerID="8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.471384 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.483728 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.492976 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.495067 4962 scope.go:117] "RemoveContainer" containerID="ba830112dfbb46386f795bf9c8f766dbff593db0dd17e9fef4db3b3bebe42597" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.499050 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.530369 4962 scope.go:117] "RemoveContainer" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.567237 4962 scope.go:117] "RemoveContainer" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.591581 4962 scope.go:117] "RemoveContainer" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.638274 4962 scope.go:117] "RemoveContainer" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" Feb 20 10:29:14 crc kubenswrapper[4962]: E0220 10:29:14.638876 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec\": container with ID starting with ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec not found: ID does not exist" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.638921 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec"} err="failed to get container status \"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec\": rpc error: code = NotFound desc = could not find container \"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec\": container with ID starting with ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec not found: ID does not exist" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.638951 4962 scope.go:117] "RemoveContainer" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" Feb 20 10:29:14 crc kubenswrapper[4962]: E0220 10:29:14.639500 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e\": container with ID starting with 2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e not found: ID does not exist" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.639561 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e"} err="failed to get container status \"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e\": rpc error: code = NotFound desc = could not find container 
\"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e\": container with ID starting with 2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e not found: ID does not exist" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.639635 4962 scope.go:117] "RemoveContainer" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" Feb 20 10:29:14 crc kubenswrapper[4962]: E0220 10:29:14.640169 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0\": container with ID starting with 8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0 not found: ID does not exist" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.640347 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0"} err="failed to get container status \"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0\": rpc error: code = NotFound desc = could not find container \"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0\": container with ID starting with 8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0 not found: ID does not exist" Feb 20 10:29:15 crc kubenswrapper[4962]: I0220 10:29:15.153492 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" path="/var/lib/kubelet/pods/8cbf1964-98e2-467d-ba01-7724a5f1a71c/volumes" Feb 20 10:29:15 crc kubenswrapper[4962]: I0220 10:29:15.156658 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" path="/var/lib/kubelet/pods/fe8d972b-b855-4f15-b7ad-10530b2b31ed/volumes" Feb 20 10:29:17 crc kubenswrapper[4962]: I0220 10:29:17.582382 
4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:17 crc kubenswrapper[4962]: I0220 10:29:17.583248 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9bxk" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" containerID="cri-o://2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" gracePeriod=2 Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.157895 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.295550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.295798 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.295899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.297004 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities" (OuterVolumeSpecName: "utilities") pod 
"128802f0-1918-4aaf-bff0-fd43fe96a1ac" (UID: "128802f0-1918-4aaf-bff0-fd43fe96a1ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.304923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6" (OuterVolumeSpecName: "kube-api-access-nhpt6") pod "128802f0-1918-4aaf-bff0-fd43fe96a1ac" (UID: "128802f0-1918-4aaf-bff0-fd43fe96a1ac"). InnerVolumeSpecName "kube-api-access-nhpt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.380657 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128802f0-1918-4aaf-bff0-fd43fe96a1ac" (UID: "128802f0-1918-4aaf-bff0-fd43fe96a1ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.397772 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.397806 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.397822 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463812 4962 generic.go:334] "Generic (PLEG): container finished" podID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" exitCode=0 Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187"} Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463911 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"19dcaa5f1fe9bc0e1a14e304278ba9fabd067499af81e2b79740a27bccc7d78d"} Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463945 4962 scope.go:117] "RemoveContainer" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 
10:29:18.464103 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.505926 4962 scope.go:117] "RemoveContainer" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.518710 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.528447 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.541683 4962 scope.go:117] "RemoveContainer" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.578969 4962 scope.go:117] "RemoveContainer" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" Feb 20 10:29:18 crc kubenswrapper[4962]: E0220 10:29:18.579624 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187\": container with ID starting with 2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187 not found: ID does not exist" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.579724 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187"} err="failed to get container status \"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187\": rpc error: code = NotFound desc = could not find container \"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187\": container with ID starting with 
2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187 not found: ID does not exist" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.579776 4962 scope.go:117] "RemoveContainer" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" Feb 20 10:29:18 crc kubenswrapper[4962]: E0220 10:29:18.580907 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c\": container with ID starting with 18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c not found: ID does not exist" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.581032 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c"} err="failed to get container status \"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c\": rpc error: code = NotFound desc = could not find container \"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c\": container with ID starting with 18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c not found: ID does not exist" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.581079 4962 scope.go:117] "RemoveContainer" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" Feb 20 10:29:18 crc kubenswrapper[4962]: E0220 10:29:18.581879 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9\": container with ID starting with f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9 not found: ID does not exist" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" Feb 20 10:29:18 crc 
kubenswrapper[4962]: I0220 10:29:18.581923 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9"} err="failed to get container status \"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9\": rpc error: code = NotFound desc = could not find container \"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9\": container with ID starting with f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9 not found: ID does not exist" Feb 20 10:29:19 crc kubenswrapper[4962]: I0220 10:29:19.150410 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" path="/var/lib/kubelet/pods/128802f0-1918-4aaf-bff0-fd43fe96a1ac/volumes" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.163903 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167264 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167298 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167328 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167346 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167385 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" 
containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167407 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167429 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167446 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167500 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167518 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167541 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167557 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167582 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167631 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167662 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" 
containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167679 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167720 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167737 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.168062 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.168095 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.168116 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.169125 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.174093 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.177792 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.178666 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.276274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.276769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.276865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.378650 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.378782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.378881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.380349 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.392315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.402568 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.503984 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.994450 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 10:30:01 crc kubenswrapper[4962]: I0220 10:30:01.864870 4962 generic.go:334] "Generic (PLEG): container finished" podID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerID="672fb8e4a3790f1f70ac1c9ed16383d55019a5b81bdd7e7049f12caa51ab0535" exitCode=0 Feb 20 10:30:01 crc kubenswrapper[4962]: I0220 10:30:01.864971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" event={"ID":"6a57c99f-e682-43fc-85be-d6ca9b32dd2e","Type":"ContainerDied","Data":"672fb8e4a3790f1f70ac1c9ed16383d55019a5b81bdd7e7049f12caa51ab0535"} Feb 20 10:30:01 crc kubenswrapper[4962]: I0220 10:30:01.865222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" 
event={"ID":"6a57c99f-e682-43fc-85be-d6ca9b32dd2e","Type":"ContainerStarted","Data":"98d481bd65aecc0a3a37c338808380116b6fcb68eddb9e99a185f9fe8e8723cd"} Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.296717 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.427252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.427388 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.427533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.428434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a57c99f-e682-43fc-85be-d6ca9b32dd2e" (UID: "6a57c99f-e682-43fc-85be-d6ca9b32dd2e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.432652 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a57c99f-e682-43fc-85be-d6ca9b32dd2e" (UID: "6a57c99f-e682-43fc-85be-d6ca9b32dd2e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.434557 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89" (OuterVolumeSpecName: "kube-api-access-rxz89") pod "6a57c99f-e682-43fc-85be-d6ca9b32dd2e" (UID: "6a57c99f-e682-43fc-85be-d6ca9b32dd2e"). InnerVolumeSpecName "kube-api-access-rxz89". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.529662 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.529697 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.529711 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") on node \"crc\" DevicePath \"\"" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.887647 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" 
event={"ID":"6a57c99f-e682-43fc-85be-d6ca9b32dd2e","Type":"ContainerDied","Data":"98d481bd65aecc0a3a37c338808380116b6fcb68eddb9e99a185f9fe8e8723cd"} Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.887709 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d481bd65aecc0a3a37c338808380116b6fcb68eddb9e99a185f9fe8e8723cd" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.887737 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:04 crc kubenswrapper[4962]: I0220 10:30:04.401068 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 10:30:04 crc kubenswrapper[4962]: I0220 10:30:04.411313 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 10:30:05 crc kubenswrapper[4962]: I0220 10:30:05.156655 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" path="/var/lib/kubelet/pods/a3652dbd-dae4-462b-be88-b8a782de8a1c/volumes" Feb 20 10:30:15 crc kubenswrapper[4962]: I0220 10:30:15.880676 4962 scope.go:117] "RemoveContainer" containerID="e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767" Feb 20 10:30:41 crc kubenswrapper[4962]: I0220 10:30:41.507855 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:30:41 crc kubenswrapper[4962]: I0220 10:30:41.508739 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:31:11 crc kubenswrapper[4962]: I0220 10:31:11.508078 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:31:11 crc kubenswrapper[4962]: I0220 10:31:11.508758 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.508112 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.508828 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.508919 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.509887 4962 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.509990 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" gracePeriod=600 Feb 20 10:31:41 crc kubenswrapper[4962]: E0220 10:31:41.641421 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.792759 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" exitCode=0 Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.792783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e"} Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.792947 4962 scope.go:117] "RemoveContainer" containerID="c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde" Feb 20 10:31:41 crc 
kubenswrapper[4962]: I0220 10:31:41.794536 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:31:41 crc kubenswrapper[4962]: E0220 10:31:41.795776 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:31:53 crc kubenswrapper[4962]: I0220 10:31:53.139088 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:31:53 crc kubenswrapper[4962]: E0220 10:31:53.140143 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:06 crc kubenswrapper[4962]: I0220 10:32:06.139035 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:06 crc kubenswrapper[4962]: E0220 10:32:06.140247 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 
20 10:32:18 crc kubenswrapper[4962]: I0220 10:32:18.138881 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:18 crc kubenswrapper[4962]: E0220 10:32:18.140219 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:30 crc kubenswrapper[4962]: I0220 10:32:30.141023 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:30 crc kubenswrapper[4962]: E0220 10:32:30.141907 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:43 crc kubenswrapper[4962]: I0220 10:32:43.139012 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:43 crc kubenswrapper[4962]: E0220 10:32:43.140293 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:58 crc kubenswrapper[4962]: I0220 10:32:58.138994 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:58 crc kubenswrapper[4962]: E0220 10:32:58.142314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:13 crc kubenswrapper[4962]: I0220 10:33:13.138907 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:13 crc kubenswrapper[4962]: E0220 10:33:13.139510 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:28 crc kubenswrapper[4962]: I0220 10:33:28.139223 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:28 crc kubenswrapper[4962]: E0220 10:33:28.140351 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:39 crc kubenswrapper[4962]: I0220 10:33:39.147952 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:39 crc kubenswrapper[4962]: E0220 10:33:39.149191 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:53 crc kubenswrapper[4962]: I0220 10:33:53.139175 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:53 crc kubenswrapper[4962]: E0220 10:33:53.140509 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:07 crc kubenswrapper[4962]: I0220 10:34:07.139640 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:07 crc kubenswrapper[4962]: E0220 10:34:07.140620 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:20 crc kubenswrapper[4962]: I0220 10:34:20.138944 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:20 crc kubenswrapper[4962]: E0220 10:34:20.139944 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:32 crc kubenswrapper[4962]: I0220 10:34:32.139517 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:32 crc kubenswrapper[4962]: E0220 10:34:32.140847 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:43 crc kubenswrapper[4962]: I0220 10:34:43.139574 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:43 crc kubenswrapper[4962]: E0220 10:34:43.140861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:57 crc kubenswrapper[4962]: I0220 10:34:57.139662 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:57 crc kubenswrapper[4962]: E0220 10:34:57.141151 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:10 crc kubenswrapper[4962]: I0220 10:35:10.139956 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:10 crc kubenswrapper[4962]: E0220 10:35:10.141277 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:25 crc kubenswrapper[4962]: I0220 10:35:25.139517 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:25 crc kubenswrapper[4962]: E0220 10:35:25.140718 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:37 crc kubenswrapper[4962]: I0220 10:35:37.141912 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:37 crc kubenswrapper[4962]: E0220 10:35:37.143024 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:49 crc kubenswrapper[4962]: I0220 10:35:49.146211 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:49 crc kubenswrapper[4962]: E0220 10:35:49.146945 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:03 crc kubenswrapper[4962]: I0220 10:36:03.140042 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:03 crc kubenswrapper[4962]: E0220 10:36:03.141153 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:18 crc kubenswrapper[4962]: I0220 10:36:18.139502 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:18 crc kubenswrapper[4962]: E0220 10:36:18.140470 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:31 crc kubenswrapper[4962]: I0220 10:36:31.140200 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:31 crc kubenswrapper[4962]: E0220 10:36:31.141559 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:44 crc kubenswrapper[4962]: I0220 10:36:44.138999 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:44 crc kubenswrapper[4962]: I0220 10:36:44.616481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f"} Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.403409 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:38:55 crc kubenswrapper[4962]: E0220 10:38:55.404434 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerName="collect-profiles" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.404457 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerName="collect-profiles" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.404649 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerName="collect-profiles" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.405929 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.410553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.410958 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.411106 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.418441 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.511857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.511917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.511966 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.512438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.512522 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.551432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.735369 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.276248 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.791411 4962 generic.go:334] "Generic (PLEG): container finished" podID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" exitCode=0 Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.791465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82"} Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.791532 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerStarted","Data":"b1a939161b7b0cf6af29a1b96d2fbba2e4c10ba8217fb1c3bc0cdfa63f4d008d"} Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.793366 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:38:57 crc kubenswrapper[4962]: I0220 10:38:57.803435 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerStarted","Data":"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d"} Feb 20 10:38:58 crc kubenswrapper[4962]: I0220 10:38:58.813655 4962 generic.go:334] "Generic (PLEG): container finished" podID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" exitCode=0 Feb 20 10:38:58 crc kubenswrapper[4962]: I0220 10:38:58.813744 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d"} Feb 20 10:38:59 crc kubenswrapper[4962]: I0220 10:38:59.823907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerStarted","Data":"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c"} Feb 20 10:38:59 crc kubenswrapper[4962]: I0220 10:38:59.855365 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnkmg" podStartSLOduration=2.417982249 podStartE2EDuration="4.855348187s" podCreationTimestamp="2026-02-20 10:38:55 +0000 UTC" firstStartedPulling="2026-02-20 10:38:56.793174521 +0000 UTC m=+2628.375646367" lastFinishedPulling="2026-02-20 10:38:59.230540449 +0000 UTC m=+2630.813012305" observedRunningTime="2026-02-20 10:38:59.850884109 +0000 UTC m=+2631.433355995" watchObservedRunningTime="2026-02-20 10:38:59.855348187 +0000 UTC m=+2631.437820043" Feb 20 10:39:05 crc kubenswrapper[4962]: I0220 10:39:05.736324 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:05 crc kubenswrapper[4962]: I0220 10:39:05.737459 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:06 crc kubenswrapper[4962]: I0220 10:39:06.808224 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nnkmg" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" probeResult="failure" output=< Feb 20 10:39:06 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:39:06 crc kubenswrapper[4962]: > Feb 20 10:39:11 crc kubenswrapper[4962]: I0220 
10:39:11.508196 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:39:11 crc kubenswrapper[4962]: I0220 10:39:11.508627 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:39:15 crc kubenswrapper[4962]: I0220 10:39:15.806075 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:15 crc kubenswrapper[4962]: I0220 10:39:15.869162 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:16 crc kubenswrapper[4962]: I0220 10:39:16.058278 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:39:16 crc kubenswrapper[4962]: I0220 10:39:16.983265 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnkmg" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" containerID="cri-o://f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" gracePeriod=2 Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.473498 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.486708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.486924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.486951 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.490268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities" (OuterVolumeSpecName: "utilities") pod "a186f0cc-4ee5-4c45-9bf0-49f496ed709b" (UID: "a186f0cc-4ee5-4c45-9bf0-49f496ed709b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.494179 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299" (OuterVolumeSpecName: "kube-api-access-rf299") pod "a186f0cc-4ee5-4c45-9bf0-49f496ed709b" (UID: "a186f0cc-4ee5-4c45-9bf0-49f496ed709b"). InnerVolumeSpecName "kube-api-access-rf299". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.588334 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.588372 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.634416 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a186f0cc-4ee5-4c45-9bf0-49f496ed709b" (UID: "a186f0cc-4ee5-4c45-9bf0-49f496ed709b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.689298 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997085 4962 generic.go:334] "Generic (PLEG): container finished" podID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" exitCode=0 Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c"} Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997159 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"b1a939161b7b0cf6af29a1b96d2fbba2e4c10ba8217fb1c3bc0cdfa63f4d008d"} Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997161 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997198 4962 scope.go:117] "RemoveContainer" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.043880 4962 scope.go:117] "RemoveContainer" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.047011 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.057095 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.076934 4962 scope.go:117] "RemoveContainer" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.109816 4962 scope.go:117] "RemoveContainer" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" Feb 20 10:39:18 crc kubenswrapper[4962]: E0220 10:39:18.110573 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c\": container with ID starting with f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c not found: ID does not exist" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.110800 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c"} err="failed to get container status \"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c\": rpc error: code = NotFound desc = could not find container \"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c\": container with ID starting with f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c not found: ID does not exist" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.111087 4962 scope.go:117] "RemoveContainer" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" Feb 20 10:39:18 crc kubenswrapper[4962]: E0220 10:39:18.112018 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d\": container with ID starting with 8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d not found: ID does not exist" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.112078 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d"} err="failed to get container status \"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d\": rpc error: code = NotFound desc = could not find container \"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d\": container with ID starting with 8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d not found: ID does not exist" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.112120 4962 scope.go:117] "RemoveContainer" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" Feb 20 10:39:18 crc kubenswrapper[4962]: E0220 
10:39:18.112535 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82\": container with ID starting with ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82 not found: ID does not exist" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.112717 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82"} err="failed to get container status \"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82\": rpc error: code = NotFound desc = could not find container \"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82\": container with ID starting with ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82 not found: ID does not exist" Feb 20 10:39:19 crc kubenswrapper[4962]: I0220 10:39:19.152634 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" path="/var/lib/kubelet/pods/a186f0cc-4ee5-4c45-9bf0-49f496ed709b/volumes" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.705218 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:37 crc kubenswrapper[4962]: E0220 10:39:37.706411 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-content" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706434 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-content" Feb 20 10:39:37 crc kubenswrapper[4962]: E0220 10:39:37.706456 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706471 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" Feb 20 10:39:37 crc kubenswrapper[4962]: E0220 10:39:37.706508 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-utilities" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706524 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-utilities" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706848 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.708700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.722329 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.728207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.728280 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"community-operators-xw4c8\" (UID: 
\"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.728501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.830601 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.830737 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.830773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.831434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"community-operators-xw4c8\" (UID: 
\"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.831445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.862500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:38 crc kubenswrapper[4962]: I0220 10:39:38.032336 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:38 crc kubenswrapper[4962]: I0220 10:39:38.286454 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:39 crc kubenswrapper[4962]: I0220 10:39:39.191546 4962 generic.go:334] "Generic (PLEG): container finished" podID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" exitCode=0 Feb 20 10:39:39 crc kubenswrapper[4962]: I0220 10:39:39.191637 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60"} Feb 20 10:39:39 crc kubenswrapper[4962]: I0220 10:39:39.191664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerStarted","Data":"ee15b0b06731212c4b415a3b4b111773ba27381417897da4370a68c602214e9d"} Feb 20 10:39:40 crc kubenswrapper[4962]: I0220 10:39:40.199425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerStarted","Data":"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7"} Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.216552 4962 generic.go:334] "Generic (PLEG): container finished" podID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" exitCode=0 Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.216646 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" 
event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7"} Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.508170 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.508245 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:39:42 crc kubenswrapper[4962]: I0220 10:39:42.230001 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerStarted","Data":"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e"} Feb 20 10:39:42 crc kubenswrapper[4962]: I0220 10:39:42.272179 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xw4c8" podStartSLOduration=2.713510833 podStartE2EDuration="5.272144447s" podCreationTimestamp="2026-02-20 10:39:37 +0000 UTC" firstStartedPulling="2026-02-20 10:39:39.199944389 +0000 UTC m=+2670.782416265" lastFinishedPulling="2026-02-20 10:39:41.758578023 +0000 UTC m=+2673.341049879" observedRunningTime="2026-02-20 10:39:42.256833872 +0000 UTC m=+2673.839305758" watchObservedRunningTime="2026-02-20 10:39:42.272144447 +0000 UTC m=+2673.854616323" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.032695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.033342 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.111116 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.370966 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.437763 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:50 crc kubenswrapper[4962]: I0220 10:39:50.317685 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xw4c8" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" containerID="cri-o://589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" gracePeriod=2 Feb 20 10:39:50 crc kubenswrapper[4962]: I0220 10:39:50.857918 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.044097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"015b23ac-0880-43e0-b6f1-cfc724c572df\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.044729 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"015b23ac-0880-43e0-b6f1-cfc724c572df\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.045073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"015b23ac-0880-43e0-b6f1-cfc724c572df\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.046233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities" (OuterVolumeSpecName: "utilities") pod "015b23ac-0880-43e0-b6f1-cfc724c572df" (UID: "015b23ac-0880-43e0-b6f1-cfc724c572df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.053831 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4" (OuterVolumeSpecName: "kube-api-access-xkcz4") pod "015b23ac-0880-43e0-b6f1-cfc724c572df" (UID: "015b23ac-0880-43e0-b6f1-cfc724c572df"). InnerVolumeSpecName "kube-api-access-xkcz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.125891 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "015b23ac-0880-43e0-b6f1-cfc724c572df" (UID: "015b23ac-0880-43e0-b6f1-cfc724c572df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.146898 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.146943 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.146962 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.330887 4962 generic.go:334] "Generic (PLEG): container finished" podID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" exitCode=0 Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.330946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e"} Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.330982 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"ee15b0b06731212c4b415a3b4b111773ba27381417897da4370a68c602214e9d"} Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.331009 4962 scope.go:117] "RemoveContainer" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.331191 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.369633 4962 scope.go:117] "RemoveContainer" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.519647 4962 scope.go:117] "RemoveContainer" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.522796 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.533791 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.539934 4962 scope.go:117] "RemoveContainer" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" Feb 20 10:39:51 crc kubenswrapper[4962]: E0220 10:39:51.540384 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e\": container with ID starting with 589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e not found: ID does not exist" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 
10:39:51.540423 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e"} err="failed to get container status \"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e\": rpc error: code = NotFound desc = could not find container \"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e\": container with ID starting with 589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e not found: ID does not exist" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540455 4962 scope.go:117] "RemoveContainer" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" Feb 20 10:39:51 crc kubenswrapper[4962]: E0220 10:39:51.540872 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7\": container with ID starting with 8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7 not found: ID does not exist" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540899 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7"} err="failed to get container status \"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7\": rpc error: code = NotFound desc = could not find container \"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7\": container with ID starting with 8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7 not found: ID does not exist" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540917 4962 scope.go:117] "RemoveContainer" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" Feb 20 10:39:51 crc 
kubenswrapper[4962]: E0220 10:39:51.542202 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60\": container with ID starting with cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60 not found: ID does not exist" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.542250 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60"} err="failed to get container status \"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60\": rpc error: code = NotFound desc = could not find container \"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60\": container with ID starting with cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60 not found: ID does not exist" Feb 20 10:39:53 crc kubenswrapper[4962]: I0220 10:39:53.149614 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" path="/var/lib/kubelet/pods/015b23ac-0880-43e0-b6f1-cfc724c572df/volumes" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.508312 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.510734 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.510927 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.512011 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.512307 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f" gracePeriod=600 Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540142 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f" exitCode=0 Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540229 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f"} Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6"} Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540626 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:42:11 crc kubenswrapper[4962]: I0220 10:42:11.508573 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:42:11 crc kubenswrapper[4962]: I0220 10:42:11.509263 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:42:41 crc kubenswrapper[4962]: I0220 10:42:41.508161 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:42:41 crc kubenswrapper[4962]: I0220 10:42:41.508861 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.219577 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 
20 10:43:09 crc kubenswrapper[4962]: E0220 10:43:09.220574 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.220963 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" Feb 20 10:43:09 crc kubenswrapper[4962]: E0220 10:43:09.221005 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-content" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.221017 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-content" Feb 20 10:43:09 crc kubenswrapper[4962]: E0220 10:43:09.221030 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-utilities" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.221038 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-utilities" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.221223 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.222831 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.235161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.301687 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.301741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.301782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.403433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.403543 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.403622 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.404169 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.404347 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.427580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.543978 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.832672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:10 crc kubenswrapper[4962]: I0220 10:43:10.137892 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df2af41-d54c-427f-91e9-b132958cb597" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" exitCode=0 Feb 20 10:43:10 crc kubenswrapper[4962]: I0220 10:43:10.137929 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3"} Feb 20 10:43:10 crc kubenswrapper[4962]: I0220 10:43:10.137953 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerStarted","Data":"f279a063082fe55c064c2ef5edc718798c07fda7d9d3ba9b3569442ae2603b1d"} Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.156373 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df2af41-d54c-427f-91e9-b132958cb597" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" exitCode=0 Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.158803 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b"} Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.508667 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.509011 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.509086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.510096 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.510218 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" gracePeriod=600 Feb 20 10:43:11 crc kubenswrapper[4962]: E0220 10:43:11.654681 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.167379 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" exitCode=0 Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.167477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6"} Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.167519 4962 scope.go:117] "RemoveContainer" containerID="42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.168050 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:12 crc kubenswrapper[4962]: E0220 10:43:12.168254 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.170710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerStarted","Data":"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f"} Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.229528 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:12 
crc kubenswrapper[4962]: I0220 10:43:12.231540 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.240901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.242469 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hqgh" podStartSLOduration=1.834561292 podStartE2EDuration="3.24244878s" podCreationTimestamp="2026-02-20 10:43:09 +0000 UTC" firstStartedPulling="2026-02-20 10:43:10.139339773 +0000 UTC m=+2881.721811619" lastFinishedPulling="2026-02-20 10:43:11.547227231 +0000 UTC m=+2883.129699107" observedRunningTime="2026-02-20 10:43:12.23177347 +0000 UTC m=+2883.814245316" watchObservedRunningTime="2026-02-20 10:43:12.24244878 +0000 UTC m=+2883.824920636" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.246617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.246665 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.246731 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348403 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348788 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.365322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.553732 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:13 crc kubenswrapper[4962]: I0220 10:43:13.027279 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:13 crc kubenswrapper[4962]: I0220 10:43:13.181551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerStarted","Data":"a6a7be7df0f34627d00563e97fbdf2a8011dd3ee63192c4a0e2864c2140079df"} Feb 20 10:43:14 crc kubenswrapper[4962]: I0220 10:43:14.191391 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" exitCode=0 Feb 20 10:43:14 crc kubenswrapper[4962]: I0220 10:43:14.191453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8"} Feb 20 10:43:15 crc kubenswrapper[4962]: I0220 10:43:15.205008 4962 generic.go:334] "Generic (PLEG): container 
finished" podID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" exitCode=0 Feb 20 10:43:15 crc kubenswrapper[4962]: I0220 10:43:15.205174 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad"} Feb 20 10:43:16 crc kubenswrapper[4962]: I0220 10:43:16.230121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerStarted","Data":"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79"} Feb 20 10:43:16 crc kubenswrapper[4962]: I0220 10:43:16.258014 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jb2lz" podStartSLOduration=2.8271723619999998 podStartE2EDuration="4.25799914s" podCreationTimestamp="2026-02-20 10:43:12 +0000 UTC" firstStartedPulling="2026-02-20 10:43:14.193992834 +0000 UTC m=+2885.776464710" lastFinishedPulling="2026-02-20 10:43:15.624819602 +0000 UTC m=+2887.207291488" observedRunningTime="2026-02-20 10:43:16.25639464 +0000 UTC m=+2887.838866496" watchObservedRunningTime="2026-02-20 10:43:16.25799914 +0000 UTC m=+2887.840470986" Feb 20 10:43:19 crc kubenswrapper[4962]: I0220 10:43:19.544349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:19 crc kubenswrapper[4962]: I0220 10:43:19.544818 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:19 crc kubenswrapper[4962]: I0220 10:43:19.629023 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hqgh" Feb 
20 10:43:20 crc kubenswrapper[4962]: I0220 10:43:20.324086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:22 crc kubenswrapper[4962]: I0220 10:43:22.554404 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:22 crc kubenswrapper[4962]: I0220 10:43:22.554840 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:22 crc kubenswrapper[4962]: I0220 10:43:22.627898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:23 crc kubenswrapper[4962]: I0220 10:43:23.140278 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:23 crc kubenswrapper[4962]: E0220 10:43:23.140680 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:23 crc kubenswrapper[4962]: I0220 10:43:23.364868 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.396374 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.396622 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hqgh" 
podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" containerID="cri-o://6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" gracePeriod=2 Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.876570 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.988532 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"2df2af41-d54c-427f-91e9-b132958cb597\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.988670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"2df2af41-d54c-427f-91e9-b132958cb597\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.988745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"2df2af41-d54c-427f-91e9-b132958cb597\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.990843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities" (OuterVolumeSpecName: "utilities") pod "2df2af41-d54c-427f-91e9-b132958cb597" (UID: "2df2af41-d54c-427f-91e9-b132958cb597"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.999488 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx" (OuterVolumeSpecName: "kube-api-access-5ckqx") pod "2df2af41-d54c-427f-91e9-b132958cb597" (UID: "2df2af41-d54c-427f-91e9-b132958cb597"). InnerVolumeSpecName "kube-api-access-5ckqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.087338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2df2af41-d54c-427f-91e9-b132958cb597" (UID: "2df2af41-d54c-427f-91e9-b132958cb597"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.094310 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.094375 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.094410 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.318884 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df2af41-d54c-427f-91e9-b132958cb597" 
containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" exitCode=0 Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.319046 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.319655 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f"} Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.319923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"f279a063082fe55c064c2ef5edc718798c07fda7d9d3ba9b3569442ae2603b1d"} Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.320009 4962 scope.go:117] "RemoveContainer" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.349834 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.355817 4962 scope.go:117] "RemoveContainer" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.357456 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.384792 4962 scope.go:117] "RemoveContainer" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.435186 4962 scope.go:117] "RemoveContainer" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" Feb 20 
10:43:25 crc kubenswrapper[4962]: E0220 10:43:25.435823 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f\": container with ID starting with 6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f not found: ID does not exist" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.435887 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f"} err="failed to get container status \"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f\": rpc error: code = NotFound desc = could not find container \"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f\": container with ID starting with 6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f not found: ID does not exist" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.435931 4962 scope.go:117] "RemoveContainer" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" Feb 20 10:43:25 crc kubenswrapper[4962]: E0220 10:43:25.436535 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b\": container with ID starting with 8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b not found: ID does not exist" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.436735 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b"} err="failed to get container status 
\"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b\": rpc error: code = NotFound desc = could not find container \"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b\": container with ID starting with 8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b not found: ID does not exist" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.436781 4962 scope.go:117] "RemoveContainer" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" Feb 20 10:43:25 crc kubenswrapper[4962]: E0220 10:43:25.437359 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3\": container with ID starting with 2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3 not found: ID does not exist" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.437405 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3"} err="failed to get container status \"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3\": rpc error: code = NotFound desc = could not find container \"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3\": container with ID starting with 2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3 not found: ID does not exist" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.154774 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df2af41-d54c-427f-91e9-b132958cb597" path="/var/lib/kubelet/pods/2df2af41-d54c-427f-91e9-b132958cb597/volumes" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.412355 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 
10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.412715 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jb2lz" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" containerID="cri-o://84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" gracePeriod=2 Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.899238 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.943993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"ff762b96-9d98-406b-81b4-b81b19473e0e\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.944118 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"ff762b96-9d98-406b-81b4-b81b19473e0e\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.944224 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"ff762b96-9d98-406b-81b4-b81b19473e0e\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.945328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities" (OuterVolumeSpecName: "utilities") pod "ff762b96-9d98-406b-81b4-b81b19473e0e" (UID: "ff762b96-9d98-406b-81b4-b81b19473e0e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.952039 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs" (OuterVolumeSpecName: "kube-api-access-2qghs") pod "ff762b96-9d98-406b-81b4-b81b19473e0e" (UID: "ff762b96-9d98-406b-81b4-b81b19473e0e"). InnerVolumeSpecName "kube-api-access-2qghs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.983148 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff762b96-9d98-406b-81b4-b81b19473e0e" (UID: "ff762b96-9d98-406b-81b4-b81b19473e0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.046091 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.046134 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.046152 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.352434 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff762b96-9d98-406b-81b4-b81b19473e0e" 
containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" exitCode=0 Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.352512 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.352525 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79"} Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.353306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"a6a7be7df0f34627d00563e97fbdf2a8011dd3ee63192c4a0e2864c2140079df"} Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.353357 4962 scope.go:117] "RemoveContainer" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.387228 4962 scope.go:117] "RemoveContainer" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.416525 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.425110 4962 scope.go:117] "RemoveContainer" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.430282 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.466727 4962 scope.go:117] "RemoveContainer" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" Feb 20 
10:43:28 crc kubenswrapper[4962]: E0220 10:43:28.467755 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79\": container with ID starting with 84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79 not found: ID does not exist" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.467841 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79"} err="failed to get container status \"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79\": rpc error: code = NotFound desc = could not find container \"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79\": container with ID starting with 84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79 not found: ID does not exist" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.467884 4962 scope.go:117] "RemoveContainer" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" Feb 20 10:43:28 crc kubenswrapper[4962]: E0220 10:43:28.468367 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad\": container with ID starting with 43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad not found: ID does not exist" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.468429 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad"} err="failed to get container status 
\"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad\": rpc error: code = NotFound desc = could not find container \"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad\": container with ID starting with 43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad not found: ID does not exist" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.468468 4962 scope.go:117] "RemoveContainer" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" Feb 20 10:43:28 crc kubenswrapper[4962]: E0220 10:43:28.469444 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8\": container with ID starting with 46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8 not found: ID does not exist" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.469500 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8"} err="failed to get container status \"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8\": rpc error: code = NotFound desc = could not find container \"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8\": container with ID starting with 46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8 not found: ID does not exist" Feb 20 10:43:29 crc kubenswrapper[4962]: I0220 10:43:29.155452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" path="/var/lib/kubelet/pods/ff762b96-9d98-406b-81b4-b81b19473e0e/volumes" Feb 20 10:43:36 crc kubenswrapper[4962]: I0220 10:43:36.138806 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 
10:43:36 crc kubenswrapper[4962]: E0220 10:43:36.141452 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:51 crc kubenswrapper[4962]: I0220 10:43:51.139925 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:51 crc kubenswrapper[4962]: E0220 10:43:51.140935 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:03 crc kubenswrapper[4962]: I0220 10:44:03.139512 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:03 crc kubenswrapper[4962]: E0220 10:44:03.140084 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:15 crc kubenswrapper[4962]: I0220 10:44:15.139323 4962 scope.go:117] "RemoveContainer" 
containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:15 crc kubenswrapper[4962]: E0220 10:44:15.140949 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:29 crc kubenswrapper[4962]: I0220 10:44:29.148879 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:29 crc kubenswrapper[4962]: E0220 10:44:29.149831 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:43 crc kubenswrapper[4962]: I0220 10:44:43.139860 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:43 crc kubenswrapper[4962]: E0220 10:44:43.142267 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:57 crc kubenswrapper[4962]: I0220 10:44:57.138930 4962 scope.go:117] 
"RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:57 crc kubenswrapper[4962]: E0220 10:44:57.140202 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.169073 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg"] Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.169907 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.169934 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.169955 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.169968 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170000 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170083 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-utilities" Feb 20 10:45:00 crc 
kubenswrapper[4962]: E0220 10:45:00.170142 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170176 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170192 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170227 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170239 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170516 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170554 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.171301 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.174908 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.174967 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.181427 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg"] Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.269025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.269099 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.269390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.370632 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.370734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.370960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.372216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.389782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.401789 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.509287 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.805768 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg"] Feb 20 10:45:01 crc kubenswrapper[4962]: I0220 10:45:01.222410 4962 generic.go:334] "Generic (PLEG): container finished" podID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerID="8e3990ca316d03e419fd0ebb840f5195fe2ec512b8606d9e1a57ef65654b00e5" exitCode=0 Feb 20 10:45:01 crc kubenswrapper[4962]: I0220 10:45:01.222550 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" event={"ID":"96b7caa8-b0e3-456c-88a2-da2e0e66d681","Type":"ContainerDied","Data":"8e3990ca316d03e419fd0ebb840f5195fe2ec512b8606d9e1a57ef65654b00e5"} Feb 20 10:45:01 crc kubenswrapper[4962]: I0220 10:45:01.222784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" 
event={"ID":"96b7caa8-b0e3-456c-88a2-da2e0e66d681","Type":"ContainerStarted","Data":"70795cd15c887297d3c34c43572bd8b4b2ba6d456853ef1131e299d78063bee8"} Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.583905 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.705675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.705917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.705988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.707195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume" (OuterVolumeSpecName: "config-volume") pod "96b7caa8-b0e3-456c-88a2-da2e0e66d681" (UID: "96b7caa8-b0e3-456c-88a2-da2e0e66d681"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.714682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "96b7caa8-b0e3-456c-88a2-da2e0e66d681" (UID: "96b7caa8-b0e3-456c-88a2-da2e0e66d681"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.714742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8" (OuterVolumeSpecName: "kube-api-access-qmhl8") pod "96b7caa8-b0e3-456c-88a2-da2e0e66d681" (UID: "96b7caa8-b0e3-456c-88a2-da2e0e66d681"). InnerVolumeSpecName "kube-api-access-qmhl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.807834 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.807888 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.807907 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") on node \"crc\" DevicePath \"\"" Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.245090 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" 
event={"ID":"96b7caa8-b0e3-456c-88a2-da2e0e66d681","Type":"ContainerDied","Data":"70795cd15c887297d3c34c43572bd8b4b2ba6d456853ef1131e299d78063bee8"} Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.245148 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.245171 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70795cd15c887297d3c34c43572bd8b4b2ba6d456853ef1131e299d78063bee8" Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.676115 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.684874 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:45:05 crc kubenswrapper[4962]: I0220 10:45:05.155752 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" path="/var/lib/kubelet/pods/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e/volumes" Feb 20 10:45:10 crc kubenswrapper[4962]: I0220 10:45:10.138652 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:10 crc kubenswrapper[4962]: E0220 10:45:10.139366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:16 crc kubenswrapper[4962]: I0220 10:45:16.294235 4962 scope.go:117] "RemoveContainer" 
containerID="9e4ff8bca8b9c2f6e4f08722be3898de4e9890a93fcdc65b9b078dd1d1fbdae2" Feb 20 10:45:22 crc kubenswrapper[4962]: I0220 10:45:22.139563 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:22 crc kubenswrapper[4962]: E0220 10:45:22.142026 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:35 crc kubenswrapper[4962]: I0220 10:45:35.140091 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:35 crc kubenswrapper[4962]: E0220 10:45:35.141118 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:47 crc kubenswrapper[4962]: I0220 10:45:47.138791 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:47 crc kubenswrapper[4962]: E0220 10:45:47.139491 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:01 crc kubenswrapper[4962]: I0220 10:46:01.140719 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:01 crc kubenswrapper[4962]: E0220 10:46:01.141636 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:16 crc kubenswrapper[4962]: I0220 10:46:16.139190 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:16 crc kubenswrapper[4962]: E0220 10:46:16.140302 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:31 crc kubenswrapper[4962]: I0220 10:46:31.139548 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:31 crc kubenswrapper[4962]: E0220 10:46:31.140575 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:45 crc kubenswrapper[4962]: I0220 10:46:45.141647 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:45 crc kubenswrapper[4962]: E0220 10:46:45.142570 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:56 crc kubenswrapper[4962]: I0220 10:46:56.139544 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:56 crc kubenswrapper[4962]: E0220 10:46:56.140669 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:07 crc kubenswrapper[4962]: I0220 10:47:07.139845 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:07 crc kubenswrapper[4962]: E0220 10:47:07.140866 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:21 crc kubenswrapper[4962]: I0220 10:47:21.139360 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:21 crc kubenswrapper[4962]: E0220 10:47:21.140631 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:33 crc kubenswrapper[4962]: I0220 10:47:33.139891 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:33 crc kubenswrapper[4962]: E0220 10:47:33.141133 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:47 crc kubenswrapper[4962]: I0220 10:47:47.139552 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:47 crc kubenswrapper[4962]: E0220 10:47:47.140659 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:48:02 crc kubenswrapper[4962]: I0220 10:48:02.139045 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:48:02 crc kubenswrapper[4962]: E0220 10:48:02.140037 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:48:14 crc kubenswrapper[4962]: I0220 10:48:14.138705 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:48:14 crc kubenswrapper[4962]: I0220 10:48:14.977010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059"} Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.386679 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:19 crc kubenswrapper[4962]: E0220 10:49:19.387453 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerName="collect-profiles" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.387467 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" 
containerName="collect-profiles" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.387652 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerName="collect-profiles" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.388738 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.408369 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.535652 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.535711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.535759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.637320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.637360 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.637391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.638030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.638237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.687401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tfx\" (UniqueName: 
\"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.719171 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.131902 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.550406 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" exitCode=0 Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.550451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f"} Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.550505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerStarted","Data":"9cdd274a165a9e9b2e89ca49f289b6d7fe464745bb688d5b654d0aaabd0db097"} Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.552686 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:49:21 crc kubenswrapper[4962]: I0220 10:49:21.559512 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerStarted","Data":"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89"} Feb 20 10:49:22 crc 
kubenswrapper[4962]: I0220 10:49:22.573050 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" exitCode=0 Feb 20 10:49:22 crc kubenswrapper[4962]: I0220 10:49:22.573186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89"} Feb 20 10:49:23 crc kubenswrapper[4962]: I0220 10:49:23.582991 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerStarted","Data":"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e"} Feb 20 10:49:23 crc kubenswrapper[4962]: I0220 10:49:23.612160 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ksv8l" podStartSLOduration=2.177121477 podStartE2EDuration="4.612132471s" podCreationTimestamp="2026-02-20 10:49:19 +0000 UTC" firstStartedPulling="2026-02-20 10:49:20.552303606 +0000 UTC m=+3252.134775462" lastFinishedPulling="2026-02-20 10:49:22.98731457 +0000 UTC m=+3254.569786456" observedRunningTime="2026-02-20 10:49:23.610439988 +0000 UTC m=+3255.192911864" watchObservedRunningTime="2026-02-20 10:49:23.612132471 +0000 UTC m=+3255.194604347" Feb 20 10:49:29 crc kubenswrapper[4962]: I0220 10:49:29.720018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:29 crc kubenswrapper[4962]: I0220 10:49:29.721169 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:30 crc kubenswrapper[4962]: I0220 10:49:30.794341 4962 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-ksv8l" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" probeResult="failure" output=< Feb 20 10:49:30 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:49:30 crc kubenswrapper[4962]: > Feb 20 10:49:39 crc kubenswrapper[4962]: I0220 10:49:39.795783 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:39 crc kubenswrapper[4962]: I0220 10:49:39.881505 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:40 crc kubenswrapper[4962]: I0220 10:49:40.047897 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:41 crc kubenswrapper[4962]: I0220 10:49:41.748143 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ksv8l" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" containerID="cri-o://e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" gracePeriod=2 Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.239222 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.332183 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.332227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.332381 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.333421 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities" (OuterVolumeSpecName: "utilities") pod "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" (UID: "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.342978 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx" (OuterVolumeSpecName: "kube-api-access-g4tfx") pod "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" (UID: "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c"). InnerVolumeSpecName "kube-api-access-g4tfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.434347 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.434403 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") on node \"crc\" DevicePath \"\"" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.520430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" (UID: "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.535734 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759521 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" exitCode=0 Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e"} Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759607 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"9cdd274a165a9e9b2e89ca49f289b6d7fe464745bb688d5b654d0aaabd0db097"} Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759628 4962 scope.go:117] "RemoveContainer" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759635 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.788450 4962 scope.go:117] "RemoveContainer" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.824308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.833964 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.844514 4962 scope.go:117] "RemoveContainer" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.867453 4962 scope.go:117] "RemoveContainer" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" Feb 20 10:49:42 crc kubenswrapper[4962]: E0220 10:49:42.868116 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e\": container with ID starting with e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e not found: ID does not exist" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868154 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e"} err="failed to get container status \"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e\": rpc error: code = NotFound desc = could not find container \"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e\": container with ID starting with e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e not found: ID does not exist" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868182 4962 scope.go:117] "RemoveContainer" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" Feb 20 10:49:42 crc kubenswrapper[4962]: E0220 10:49:42.868487 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89\": container with ID starting with dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89 not found: ID does not exist" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868516 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89"} err="failed to get container status \"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89\": rpc error: code = NotFound desc = could not find container \"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89\": container with ID starting with dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89 not found: ID does not exist" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868538 4962 scope.go:117] "RemoveContainer" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" Feb 20 10:49:42 crc kubenswrapper[4962]: E0220 
10:49:42.868814 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f\": container with ID starting with 65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f not found: ID does not exist" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868836 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f"} err="failed to get container status \"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f\": rpc error: code = NotFound desc = could not find container \"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f\": container with ID starting with 65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f not found: ID does not exist" Feb 20 10:49:43 crc kubenswrapper[4962]: I0220 10:49:43.154725 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" path="/var/lib/kubelet/pods/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c/volumes" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.113493 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:49:53 crc kubenswrapper[4962]: E0220 10:49:53.114431 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-content" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114448 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-content" Feb 20 10:49:53 crc kubenswrapper[4962]: E0220 10:49:53.114470 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114477 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" Feb 20 10:49:53 crc kubenswrapper[4962]: E0220 10:49:53.114493 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-utilities" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114501 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-utilities" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114697 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.115631 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.125933 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.204865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.204939 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"community-operators-hm46k\" (UID: 
\"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.204961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.306447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.306541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.306579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.307085 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"community-operators-hm46k\" (UID: 
\"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.307102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.325890 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.430062 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: W0220 10:49:53.956589 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f06592_8c43_4e8e_95aa_0d2ec94e7cfb.slice/crio-5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025 WatchSource:0}: Error finding container 5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025: Status 404 returned error can't find the container with id 5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025 Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.958969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:49:54 crc kubenswrapper[4962]: I0220 10:49:54.886899 4962 generic.go:334] "Generic (PLEG): container finished" podID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" exitCode=0 Feb 20 10:49:54 crc kubenswrapper[4962]: I0220 10:49:54.887179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9"} Feb 20 10:49:54 crc kubenswrapper[4962]: I0220 10:49:54.887339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerStarted","Data":"5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025"} Feb 20 10:49:56 crc kubenswrapper[4962]: I0220 10:49:56.905748 4962 generic.go:334] "Generic (PLEG): container finished" podID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" exitCode=0 Feb 20 10:49:56 crc kubenswrapper[4962]: I0220 
10:49:56.905838 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37"} Feb 20 10:49:57 crc kubenswrapper[4962]: I0220 10:49:57.919703 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerStarted","Data":"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2"} Feb 20 10:49:57 crc kubenswrapper[4962]: I0220 10:49:57.955227 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hm46k" podStartSLOduration=2.558229145 podStartE2EDuration="4.9552001s" podCreationTimestamp="2026-02-20 10:49:53 +0000 UTC" firstStartedPulling="2026-02-20 10:49:54.891119485 +0000 UTC m=+3286.473591361" lastFinishedPulling="2026-02-20 10:49:57.28809044 +0000 UTC m=+3288.870562316" observedRunningTime="2026-02-20 10:49:57.945009857 +0000 UTC m=+3289.527481733" watchObservedRunningTime="2026-02-20 10:49:57.9552001 +0000 UTC m=+3289.537671986" Feb 20 10:50:03 crc kubenswrapper[4962]: I0220 10:50:03.430353 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:03 crc kubenswrapper[4962]: I0220 10:50:03.432043 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:03 crc kubenswrapper[4962]: I0220 10:50:03.509643 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:04 crc kubenswrapper[4962]: I0220 10:50:04.044121 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hm46k" Feb 20 
10:50:04 crc kubenswrapper[4962]: I0220 10:50:04.108261 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:50:05 crc kubenswrapper[4962]: I0220 10:50:05.990430 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hm46k" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" containerID="cri-o://1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" gracePeriod=2 Feb 20 10:50:06 crc kubenswrapper[4962]: E0220 10:50:06.142684 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f06592_8c43_4e8e_95aa_0d2ec94e7cfb.slice/crio-1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f06592_8c43_4e8e_95aa_0d2ec94e7cfb.slice/crio-conmon-1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.506567 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.629260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.629379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.629418 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.630367 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities" (OuterVolumeSpecName: "utilities") pod "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" (UID: "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.641730 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg" (OuterVolumeSpecName: "kube-api-access-tg7zg") pod "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" (UID: "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb"). InnerVolumeSpecName "kube-api-access-tg7zg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.730768 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.731106 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") on node \"crc\" DevicePath \"\"" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.802067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" (UID: "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.833250 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003625 4962 generic.go:334] "Generic (PLEG): container finished" podID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" exitCode=0 Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2"} Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003776 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025"} Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003817 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003827 4962 scope.go:117] "RemoveContainer" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.036920 4962 scope.go:117] "RemoveContainer" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.054085 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.067455 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.072961 4962 scope.go:117] "RemoveContainer" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.095236 4962 scope.go:117] "RemoveContainer" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" Feb 20 10:50:07 crc kubenswrapper[4962]: E0220 10:50:07.095869 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2\": container with ID starting with 1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2 not found: ID does not exist" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 
10:50:07.095909 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2"} err="failed to get container status \"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2\": rpc error: code = NotFound desc = could not find container \"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2\": container with ID starting with 1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2 not found: ID does not exist" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.095935 4962 scope.go:117] "RemoveContainer" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" Feb 20 10:50:07 crc kubenswrapper[4962]: E0220 10:50:07.096303 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37\": container with ID starting with 83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37 not found: ID does not exist" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.096338 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37"} err="failed to get container status \"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37\": rpc error: code = NotFound desc = could not find container \"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37\": container with ID starting with 83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37 not found: ID does not exist" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.096367 4962 scope.go:117] "RemoveContainer" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" Feb 20 10:50:07 crc 
kubenswrapper[4962]: E0220 10:50:07.096695 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9\": container with ID starting with d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9 not found: ID does not exist" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.096730 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9"} err="failed to get container status \"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9\": rpc error: code = NotFound desc = could not find container \"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9\": container with ID starting with d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9 not found: ID does not exist" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.156438 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" path="/var/lib/kubelet/pods/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb/volumes" Feb 20 10:50:41 crc kubenswrapper[4962]: I0220 10:50:41.508309 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:50:41 crc kubenswrapper[4962]: I0220 10:50:41.509072 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 10:51:11 crc kubenswrapper[4962]: I0220 10:51:11.508062 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:51:11 crc kubenswrapper[4962]: I0220 10:51:11.510322 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.508687 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.509297 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.509361 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.510242 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.510338 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059" gracePeriod=600 Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.889792 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059" exitCode=0 Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.889868 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059"} Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.890193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5"} Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.890237 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.669777 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:31 crc kubenswrapper[4962]: E0220 
10:53:31.670852 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-utilities" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.670876 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-utilities" Feb 20 10:53:31 crc kubenswrapper[4962]: E0220 10:53:31.670903 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.670916 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" Feb 20 10:53:31 crc kubenswrapper[4962]: E0220 10:53:31.670968 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-content" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.670981 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-content" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.671235 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.673038 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.683582 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.816745 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.816804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.817156 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.918920 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.919009 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.919056 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.920870 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.920902 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.951572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.000002 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.507380 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.951474 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" exitCode=0 Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.951524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40"} Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.951552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerStarted","Data":"4abdc1336b017d4b427cff40cec7fac8ae13ddc593821c53d5d721bfac421ae3"} Feb 20 10:53:33 crc kubenswrapper[4962]: I0220 10:53:33.968733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerStarted","Data":"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87"} Feb 20 10:53:34 crc kubenswrapper[4962]: I0220 10:53:34.981238 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" exitCode=0 Feb 20 10:53:34 crc kubenswrapper[4962]: I0220 10:53:34.981305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" 
event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87"} Feb 20 10:53:35 crc kubenswrapper[4962]: I0220 10:53:35.989021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerStarted","Data":"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4"} Feb 20 10:53:36 crc kubenswrapper[4962]: I0220 10:53:36.011496 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4twc6" podStartSLOduration=2.603251475 podStartE2EDuration="5.011471871s" podCreationTimestamp="2026-02-20 10:53:31 +0000 UTC" firstStartedPulling="2026-02-20 10:53:32.953103628 +0000 UTC m=+3504.535575474" lastFinishedPulling="2026-02-20 10:53:35.361323984 +0000 UTC m=+3506.943795870" observedRunningTime="2026-02-20 10:53:36.007464831 +0000 UTC m=+3507.589936677" watchObservedRunningTime="2026-02-20 10:53:36.011471871 +0000 UTC m=+3507.593943757" Feb 20 10:53:41 crc kubenswrapper[4962]: I0220 10:53:41.508807 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:53:41 crc kubenswrapper[4962]: I0220 10:53:41.509662 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.000857 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.000918 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.079453 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.154695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.331437 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.053212 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4twc6" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" containerID="cri-o://0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" gracePeriod=2 Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.640637 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.749955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.750094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.750136 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.751178 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities" (OuterVolumeSpecName: "utilities") pod "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" (UID: "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.758447 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j" (OuterVolumeSpecName: "kube-api-access-csc5j") pod "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" (UID: "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f"). InnerVolumeSpecName "kube-api-access-csc5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.805299 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" (UID: "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.850926 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") on node \"crc\" DevicePath \"\"" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.850957 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.850967 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.065511 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" exitCode=0 Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4"} Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066067 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"4abdc1336b017d4b427cff40cec7fac8ae13ddc593821c53d5d721bfac421ae3"} Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066097 4962 scope.go:117] "RemoveContainer" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066261 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.105066 4962 scope.go:117] "RemoveContainer" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.127170 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.154175 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.156335 4962 scope.go:117] "RemoveContainer" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.187094 4962 scope.go:117] "RemoveContainer" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" Feb 20 10:53:45 crc kubenswrapper[4962]: E0220 10:53:45.188082 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4\": container with ID starting with 0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4 not found: ID does not exist" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 
10:53:45.188132 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4"} err="failed to get container status \"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4\": rpc error: code = NotFound desc = could not find container \"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4\": container with ID starting with 0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4 not found: ID does not exist" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188161 4962 scope.go:117] "RemoveContainer" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" Feb 20 10:53:45 crc kubenswrapper[4962]: E0220 10:53:45.188732 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87\": container with ID starting with d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87 not found: ID does not exist" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188761 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87"} err="failed to get container status \"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87\": rpc error: code = NotFound desc = could not find container \"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87\": container with ID starting with d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87 not found: ID does not exist" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188773 4962 scope.go:117] "RemoveContainer" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" Feb 20 10:53:45 crc 
kubenswrapper[4962]: E0220 10:53:45.189087 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40\": container with ID starting with b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40 not found: ID does not exist" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.189139 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40"} err="failed to get container status \"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40\": rpc error: code = NotFound desc = could not find container \"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40\": container with ID starting with b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40 not found: ID does not exist" Feb 20 10:53:47 crc kubenswrapper[4962]: I0220 10:53:47.155170 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" path="/var/lib/kubelet/pods/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f/volumes" Feb 20 10:54:11 crc kubenswrapper[4962]: I0220 10:54:11.508517 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:54:11 crc kubenswrapper[4962]: I0220 10:54:11.509207 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.508721 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.511563 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.512075 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.513055 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.513314 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" gracePeriod=600 Feb 20 10:54:41 crc kubenswrapper[4962]: E0220 10:54:41.649379 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.594959 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" exitCode=0 Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.595042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5"} Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.595100 4962 scope.go:117] "RemoveContainer" containerID="96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059" Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.595797 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:54:42 crc kubenswrapper[4962]: E0220 10:54:42.596389 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:54:54 crc kubenswrapper[4962]: I0220 10:54:54.139031 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:54:54 crc kubenswrapper[4962]: 
E0220 10:54:54.140238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:07 crc kubenswrapper[4962]: I0220 10:55:07.139527 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:07 crc kubenswrapper[4962]: E0220 10:55:07.140575 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:19 crc kubenswrapper[4962]: I0220 10:55:19.167220 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:19 crc kubenswrapper[4962]: E0220 10:55:19.168446 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:30 crc kubenswrapper[4962]: I0220 10:55:30.139268 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:30 crc 
kubenswrapper[4962]: E0220 10:55:30.140307 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:41 crc kubenswrapper[4962]: I0220 10:55:41.139315 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:41 crc kubenswrapper[4962]: E0220 10:55:41.140322 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:56 crc kubenswrapper[4962]: I0220 10:55:56.139862 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:56 crc kubenswrapper[4962]: E0220 10:55:56.141042 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:09 crc kubenswrapper[4962]: I0220 10:56:09.146383 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 
20 10:56:09 crc kubenswrapper[4962]: E0220 10:56:09.147354 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:22 crc kubenswrapper[4962]: I0220 10:56:22.139423 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:22 crc kubenswrapper[4962]: E0220 10:56:22.140708 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:34 crc kubenswrapper[4962]: I0220 10:56:34.138864 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:34 crc kubenswrapper[4962]: E0220 10:56:34.139687 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:47 crc kubenswrapper[4962]: I0220 10:56:47.139007 4962 scope.go:117] "RemoveContainer" 
containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:47 crc kubenswrapper[4962]: E0220 10:56:47.140098 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:58 crc kubenswrapper[4962]: I0220 10:56:58.138740 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:58 crc kubenswrapper[4962]: E0220 10:56:58.139731 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:12 crc kubenswrapper[4962]: I0220 10:57:12.139187 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:12 crc kubenswrapper[4962]: E0220 10:57:12.140326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.655018 4962 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:23 crc kubenswrapper[4962]: E0220 10:57:23.655984 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-content" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656006 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-content" Feb 20 10:57:23 crc kubenswrapper[4962]: E0220 10:57:23.656039 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-utilities" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656051 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-utilities" Feb 20 10:57:23 crc kubenswrapper[4962]: E0220 10:57:23.656086 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656099 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656334 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.657983 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.673136 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.804336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.804449 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.804484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906372 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906415 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906960 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906980 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.941575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:24 crc kubenswrapper[4962]: I0220 10:57:24.002210 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:24 crc kubenswrapper[4962]: I0220 10:57:24.501775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.074183 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" exitCode=0 Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.074297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8"} Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.074786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerStarted","Data":"03fd792e4282a240ffeedaca88c255b25b9b927c2ad0f536473a6f08061f5e6e"} Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.077055 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.144167 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:25 crc kubenswrapper[4962]: E0220 10:57:25.144560 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 
10:57:26 crc kubenswrapper[4962]: I0220 10:57:26.088926 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerStarted","Data":"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b"} Feb 20 10:57:27 crc kubenswrapper[4962]: I0220 10:57:27.102476 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" exitCode=0 Feb 20 10:57:27 crc kubenswrapper[4962]: I0220 10:57:27.102554 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b"} Feb 20 10:57:28 crc kubenswrapper[4962]: I0220 10:57:28.115979 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerStarted","Data":"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a"} Feb 20 10:57:28 crc kubenswrapper[4962]: I0220 10:57:28.147167 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f5f7g" podStartSLOduration=2.714216917 podStartE2EDuration="5.147141359s" podCreationTimestamp="2026-02-20 10:57:23 +0000 UTC" firstStartedPulling="2026-02-20 10:57:25.076391545 +0000 UTC m=+3736.658863431" lastFinishedPulling="2026-02-20 10:57:27.509315987 +0000 UTC m=+3739.091787873" observedRunningTime="2026-02-20 10:57:28.143408864 +0000 UTC m=+3739.725880740" watchObservedRunningTime="2026-02-20 10:57:28.147141359 +0000 UTC m=+3739.729613235" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.002729 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.003451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.083677 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.217626 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:37 crc kubenswrapper[4962]: I0220 10:57:37.629798 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:37 crc kubenswrapper[4962]: I0220 10:57:37.630433 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f5f7g" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" containerID="cri-o://a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" gracePeriod=2 Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.029876 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.135337 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.135480 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.135517 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.136414 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities" (OuterVolumeSpecName: "utilities") pod "bfdfb336-dce1-418b-9c2a-81c68335f4bf" (UID: "bfdfb336-dce1-418b-9c2a-81c68335f4bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.138915 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.139173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.141921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8" (OuterVolumeSpecName: "kube-api-access-vf2f8") pod "bfdfb336-dce1-418b-9c2a-81c68335f4bf" (UID: "bfdfb336-dce1-418b-9c2a-81c68335f4bf"). InnerVolumeSpecName "kube-api-access-vf2f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.165095 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfdfb336-dce1-418b-9c2a-81c68335f4bf" (UID: "bfdfb336-dce1-418b-9c2a-81c68335f4bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189058 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" exitCode=0 Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a"} Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"03fd792e4282a240ffeedaca88c255b25b9b927c2ad0f536473a6f08061f5e6e"} Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189184 4962 scope.go:117] "RemoveContainer" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189363 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.218383 4962 scope.go:117] "RemoveContainer" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.235859 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.237571 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") on node \"crc\" DevicePath \"\"" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.237640 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.237658 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.244731 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.272977 4962 scope.go:117] "RemoveContainer" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.308607 4962 scope.go:117] "RemoveContainer" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.309339 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a\": container with ID starting with a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a not found: ID does not exist" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.309387 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a"} err="failed to get container status \"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a\": rpc error: code = NotFound desc = could not find container \"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a\": container with ID starting with a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a not found: ID does not exist" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.309420 4962 scope.go:117] "RemoveContainer" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.310448 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b\": container with ID starting with 46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b not found: ID does not exist" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.310511 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b"} err="failed to get container status \"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b\": rpc error: code = NotFound desc = could not find container \"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b\": container with ID 
starting with 46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b not found: ID does not exist" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.310545 4962 scope.go:117] "RemoveContainer" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.311959 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8\": container with ID starting with 69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8 not found: ID does not exist" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.312100 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8"} err="failed to get container status \"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8\": rpc error: code = NotFound desc = could not find container \"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8\": container with ID starting with 69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8 not found: ID does not exist" Feb 20 10:57:39 crc kubenswrapper[4962]: I0220 10:57:39.155282 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" path="/var/lib/kubelet/pods/bfdfb336-dce1-418b-9c2a-81c68335f4bf/volumes" Feb 20 10:57:53 crc kubenswrapper[4962]: I0220 10:57:53.138917 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:53 crc kubenswrapper[4962]: E0220 10:57:53.139807 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:07 crc kubenswrapper[4962]: I0220 10:58:07.139466 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:07 crc kubenswrapper[4962]: E0220 10:58:07.140566 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:20 crc kubenswrapper[4962]: I0220 10:58:20.167564 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:20 crc kubenswrapper[4962]: E0220 10:58:20.168912 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:32 crc kubenswrapper[4962]: I0220 10:58:32.138844 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:32 crc kubenswrapper[4962]: E0220 10:58:32.139818 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:46 crc kubenswrapper[4962]: I0220 10:58:46.139445 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:46 crc kubenswrapper[4962]: E0220 10:58:46.140403 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:01 crc kubenswrapper[4962]: I0220 10:59:01.139568 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:01 crc kubenswrapper[4962]: E0220 10:59:01.140441 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:14 crc kubenswrapper[4962]: I0220 10:59:14.139347 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:14 crc kubenswrapper[4962]: E0220 10:59:14.140998 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:28 crc kubenswrapper[4962]: I0220 10:59:28.138927 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:28 crc kubenswrapper[4962]: E0220 10:59:28.139986 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.952568 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 10:59:41 crc kubenswrapper[4962]: E0220 10:59:41.953900 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-content" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.953927 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-content" Feb 20 10:59:41 crc kubenswrapper[4962]: E0220 10:59:41.953952 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.953968 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" Feb 20 10:59:41 crc kubenswrapper[4962]: E0220 
10:59:41.954020 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-utilities" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.954039 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-utilities" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.954341 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.955980 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.968987 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.085849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.085930 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.086124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187855 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.208130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.285374 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.719672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.138955 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.318392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b"} Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.319799 4962 generic.go:334] "Generic (PLEG): container finished" podID="84d46def-f006-4801-a633-a88796c6dc6b" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" exitCode=0 Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.319845 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" 
event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966"} Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.319875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerStarted","Data":"d786add5122e41426cb69b4c43033a6b7bc34c4cb08c9a68988cdfc0850b6c3b"} Feb 20 10:59:45 crc kubenswrapper[4962]: I0220 10:59:45.341518 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerStarted","Data":"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa"} Feb 20 10:59:46 crc kubenswrapper[4962]: I0220 10:59:46.353549 4962 generic.go:334] "Generic (PLEG): container finished" podID="84d46def-f006-4801-a633-a88796c6dc6b" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" exitCode=0 Feb 20 10:59:46 crc kubenswrapper[4962]: I0220 10:59:46.353681 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa"} Feb 20 10:59:47 crc kubenswrapper[4962]: I0220 10:59:47.370177 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerStarted","Data":"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12"} Feb 20 10:59:47 crc kubenswrapper[4962]: I0220 10:59:47.408177 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cq2nm" podStartSLOduration=2.976510471 podStartE2EDuration="6.408150966s" podCreationTimestamp="2026-02-20 10:59:41 +0000 UTC" 
firstStartedPulling="2026-02-20 10:59:43.321072129 +0000 UTC m=+3874.903543985" lastFinishedPulling="2026-02-20 10:59:46.752712594 +0000 UTC m=+3878.335184480" observedRunningTime="2026-02-20 10:59:47.391859966 +0000 UTC m=+3878.974331842" watchObservedRunningTime="2026-02-20 10:59:47.408150966 +0000 UTC m=+3878.990622852" Feb 20 10:59:52 crc kubenswrapper[4962]: I0220 10:59:52.285655 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:52 crc kubenswrapper[4962]: I0220 10:59:52.286364 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.344187 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cq2nm" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" probeResult="failure" output=< Feb 20 10:59:53 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:59:53 crc kubenswrapper[4962]: > Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.395307 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.397799 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.419171 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.563809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.563887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.563910 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665281 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.692201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.720524 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:54 crc kubenswrapper[4962]: I0220 10:59:54.413720 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 10:59:54 crc kubenswrapper[4962]: W0220 10:59:54.419566 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e53575b_37ab_4c46_be3f_6ac873a2a9d0.slice/crio-bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff WatchSource:0}: Error finding container bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff: Status 404 returned error can't find the container with id bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff Feb 20 10:59:54 crc kubenswrapper[4962]: I0220 10:59:54.442295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerStarted","Data":"bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff"} Feb 20 10:59:55 crc kubenswrapper[4962]: I0220 10:59:55.454587 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerID="9c0af593b4b43f8781c7a5aba922acad47038b25f5521623f3ea1762a03d3532" exitCode=0 Feb 20 10:59:55 crc kubenswrapper[4962]: I0220 10:59:55.454762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"9c0af593b4b43f8781c7a5aba922acad47038b25f5521623f3ea1762a03d3532"} Feb 20 10:59:56 crc kubenswrapper[4962]: I0220 10:59:56.467905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" 
event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerStarted","Data":"02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e"} Feb 20 10:59:57 crc kubenswrapper[4962]: I0220 10:59:57.480335 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerID="02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e" exitCode=0 Feb 20 10:59:57 crc kubenswrapper[4962]: I0220 10:59:57.480486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e"} Feb 20 10:59:58 crc kubenswrapper[4962]: I0220 10:59:58.492137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerStarted","Data":"1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6"} Feb 20 10:59:58 crc kubenswrapper[4962]: I0220 10:59:58.523723 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mr5wb" podStartSLOduration=2.972382537 podStartE2EDuration="5.52370061s" podCreationTimestamp="2026-02-20 10:59:53 +0000 UTC" firstStartedPulling="2026-02-20 10:59:55.459517748 +0000 UTC m=+3887.041989634" lastFinishedPulling="2026-02-20 10:59:58.010835831 +0000 UTC m=+3889.593307707" observedRunningTime="2026-02-20 10:59:58.518058398 +0000 UTC m=+3890.100530294" watchObservedRunningTime="2026-02-20 10:59:58.52370061 +0000 UTC m=+3890.106172486" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.212427 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd"] Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.214779 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.216932 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.217588 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.224134 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd"] Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.370829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.371217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.371454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.473044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.473144 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.473182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.474931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.482782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.505509 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.540703 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.799726 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd"] Feb 20 11:00:01 crc kubenswrapper[4962]: I0220 11:00:01.522524 4962 generic.go:334] "Generic (PLEG): container finished" podID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerID="0417c10167fec51451355a4b52c16cd2e8025894a6838915d3bce249a3562e11" exitCode=0 Feb 20 11:00:01 crc kubenswrapper[4962]: I0220 11:00:01.522649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" event={"ID":"444fe6ec-91a1-4572-9cc7-59f9848bd957","Type":"ContainerDied","Data":"0417c10167fec51451355a4b52c16cd2e8025894a6838915d3bce249a3562e11"} Feb 20 11:00:01 crc kubenswrapper[4962]: I0220 11:00:01.523021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" 
event={"ID":"444fe6ec-91a1-4572-9cc7-59f9848bd957","Type":"ContainerStarted","Data":"9d2297082c21d3fe2a03b7b950f9cb27dd771fe868117807120a4e92bf8369ec"} Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.363631 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.441909 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.621748 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.873133 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.015073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"444fe6ec-91a1-4572-9cc7-59f9848bd957\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.015184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"444fe6ec-91a1-4572-9cc7-59f9848bd957\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.015522 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"444fe6ec-91a1-4572-9cc7-59f9848bd957\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " Feb 20 
11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.016252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume" (OuterVolumeSpecName: "config-volume") pod "444fe6ec-91a1-4572-9cc7-59f9848bd957" (UID: "444fe6ec-91a1-4572-9cc7-59f9848bd957"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.022622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc" (OuterVolumeSpecName: "kube-api-access-xb2cc") pod "444fe6ec-91a1-4572-9cc7-59f9848bd957" (UID: "444fe6ec-91a1-4572-9cc7-59f9848bd957"). InnerVolumeSpecName "kube-api-access-xb2cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.022852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "444fe6ec-91a1-4572-9cc7-59f9848bd957" (UID: "444fe6ec-91a1-4572-9cc7-59f9848bd957"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.117739 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.117781 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.117797 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" event={"ID":"444fe6ec-91a1-4572-9cc7-59f9848bd957","Type":"ContainerDied","Data":"9d2297082c21d3fe2a03b7b950f9cb27dd771fe868117807120a4e92bf8369ec"} Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545822 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d2297082c21d3fe2a03b7b950f9cb27dd771fe868117807120a4e92bf8369ec" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545858 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cq2nm" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" containerID="cri-o://186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" gracePeriod=2 Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545957 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.721443 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.721523 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.788518 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.965344 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.972033 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.019843 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.143851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"84d46def-f006-4801-a633-a88796c6dc6b\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.145117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities" (OuterVolumeSpecName: "utilities") pod "84d46def-f006-4801-a633-a88796c6dc6b" (UID: "84d46def-f006-4801-a633-a88796c6dc6b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.145350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"84d46def-f006-4801-a633-a88796c6dc6b\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.145420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"84d46def-f006-4801-a633-a88796c6dc6b\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.150284 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.150852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch" (OuterVolumeSpecName: "kube-api-access-pkzch") pod "84d46def-f006-4801-a633-a88796c6dc6b" (UID: "84d46def-f006-4801-a633-a88796c6dc6b"). InnerVolumeSpecName "kube-api-access-pkzch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.251751 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.364152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84d46def-f006-4801-a633-a88796c6dc6b" (UID: "84d46def-f006-4801-a633-a88796c6dc6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.454818 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562783 4962 generic.go:334] "Generic (PLEG): container finished" podID="84d46def-f006-4801-a633-a88796c6dc6b" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" exitCode=0 Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562863 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12"} Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"d786add5122e41426cb69b4c43033a6b7bc34c4cb08c9a68988cdfc0850b6c3b"} Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 
11:00:04.562979 4962 scope.go:117] "RemoveContainer" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562888 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.592882 4962 scope.go:117] "RemoveContainer" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.623477 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.633186 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.648355 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.664250 4962 scope.go:117] "RemoveContainer" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.697655 4962 scope.go:117] "RemoveContainer" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" Feb 20 11:00:04 crc kubenswrapper[4962]: E0220 11:00:04.698312 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12\": container with ID starting with 186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12 not found: ID does not exist" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698389 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12"} err="failed to get container status \"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12\": rpc error: code = NotFound desc = could not find container \"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12\": container with ID starting with 186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12 not found: ID does not exist" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698433 4962 scope.go:117] "RemoveContainer" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" Feb 20 11:00:04 crc kubenswrapper[4962]: E0220 11:00:04.698890 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa\": container with ID starting with 189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa not found: ID does not exist" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698933 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa"} err="failed to get container status \"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa\": rpc error: code = NotFound desc = could not find container \"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa\": container with ID starting with 189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa not found: ID does not exist" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698988 4962 scope.go:117] "RemoveContainer" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" Feb 20 11:00:04 crc kubenswrapper[4962]: E0220 11:00:04.699442 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966\": container with ID starting with bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966 not found: ID does not exist" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.699492 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966"} err="failed to get container status \"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966\": rpc error: code = NotFound desc = could not find container \"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966\": container with ID starting with bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966 not found: ID does not exist" Feb 20 11:00:05 crc kubenswrapper[4962]: I0220 11:00:05.156857 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d46def-f006-4801-a633-a88796c6dc6b" path="/var/lib/kubelet/pods/84d46def-f006-4801-a633-a88796c6dc6b/volumes" Feb 20 11:00:05 crc kubenswrapper[4962]: I0220 11:00:05.159142 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" path="/var/lib/kubelet/pods/fc9c6a80-7747-461e-8f29-f371984a8c95/volumes" Feb 20 11:00:05 crc kubenswrapper[4962]: I0220 11:00:05.822260 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 11:00:06 crc kubenswrapper[4962]: I0220 11:00:06.587340 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mr5wb" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" containerID="cri-o://1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6" gracePeriod=2 Feb 20 
11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.603847 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerID="1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6" exitCode=0 Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.603949 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6"} Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.604256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff"} Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.604282 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.617941 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.626951 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.627039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.627075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.629273 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities" (OuterVolumeSpecName: "utilities") pod "4e53575b-37ab-4c46-be3f-6ac873a2a9d0" (UID: "4e53575b-37ab-4c46-be3f-6ac873a2a9d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.635737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q" (OuterVolumeSpecName: "kube-api-access-nvq4q") pod "4e53575b-37ab-4c46-be3f-6ac873a2a9d0" (UID: "4e53575b-37ab-4c46-be3f-6ac873a2a9d0"). InnerVolumeSpecName "kube-api-access-nvq4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.723951 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e53575b-37ab-4c46-be3f-6ac873a2a9d0" (UID: "4e53575b-37ab-4c46-be3f-6ac873a2a9d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.729347 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.729423 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.729465 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:08 crc kubenswrapper[4962]: I0220 11:00:08.612098 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:08 crc kubenswrapper[4962]: I0220 11:00:08.664053 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 11:00:08 crc kubenswrapper[4962]: I0220 11:00:08.692175 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 11:00:09 crc kubenswrapper[4962]: I0220 11:00:09.162504 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" path="/var/lib/kubelet/pods/4e53575b-37ab-4c46-be3f-6ac873a2a9d0/volumes" Feb 20 11:00:16 crc kubenswrapper[4962]: I0220 11:00:16.681438 4962 scope.go:117] "RemoveContainer" containerID="c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f" Feb 20 11:02:11 crc kubenswrapper[4962]: I0220 11:02:11.508285 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:02:11 crc kubenswrapper[4962]: I0220 11:02:11.508995 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:02:41 crc kubenswrapper[4962]: I0220 11:02:41.507984 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:02:41 crc kubenswrapper[4962]: 
I0220 11:02:41.508667 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.508288 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.509038 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.509112 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.510018 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.510144 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" 
containerName="machine-config-daemon" containerID="cri-o://b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b" gracePeriod=600 Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.299559 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b" exitCode=0 Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.299693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b"} Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.299991 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b"} Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.300016 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 11:05:11 crc kubenswrapper[4962]: I0220 11:05:11.507891 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:05:11 crc kubenswrapper[4962]: I0220 11:05:11.508880 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 
20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.383729 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385154 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385186 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385212 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385229 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385417 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385435 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385467 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385509 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerName="collect-profiles" Feb 20 11:05:31 crc 
kubenswrapper[4962]: I0220 11:05:31.385525 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerName="collect-profiles" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385553 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385569 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385639 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385656 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385981 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerName="collect-profiles" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.386027 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.386052 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.388342 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.403941 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.484925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mjm\" (UniqueName: \"kubernetes.io/projected/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-kube-api-access-p4mjm\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.485001 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-catalog-content\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.485073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-utilities\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.586973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mjm\" (UniqueName: \"kubernetes.io/projected/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-kube-api-access-p4mjm\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587029 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-catalog-content\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587057 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-utilities\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-utilities\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-catalog-content\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.609910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mjm\" (UniqueName: \"kubernetes.io/projected/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-kube-api-access-p4mjm\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.720455 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.234628 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.562220 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21d4aaf-2f5d-4576-a1e1-b8c233e285f1" containerID="dcb86e3eec5ec0c3d8c242ccea873e40fc4246d7d95f6d3de2d2989d0bc5d1d9" exitCode=0 Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.562313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerDied","Data":"dcb86e3eec5ec0c3d8c242ccea873e40fc4246d7d95f6d3de2d2989d0bc5d1d9"} Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.562422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerStarted","Data":"a2e4d3ebe9ffa77afb7c06a23d865e75f7c83845a11917eeb9f33f455fc0e38b"} Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.564661 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 11:05:36 crc kubenswrapper[4962]: I0220 11:05:36.596908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerStarted","Data":"52b89b420f7ad1845f0c6551a2469ff5f038736c139b94a10b318273d8f78738"} Feb 20 11:05:37 crc kubenswrapper[4962]: I0220 11:05:37.608990 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21d4aaf-2f5d-4576-a1e1-b8c233e285f1" containerID="52b89b420f7ad1845f0c6551a2469ff5f038736c139b94a10b318273d8f78738" exitCode=0 Feb 20 11:05:37 crc kubenswrapper[4962]: I0220 11:05:37.609117 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerDied","Data":"52b89b420f7ad1845f0c6551a2469ff5f038736c139b94a10b318273d8f78738"} Feb 20 11:05:38 crc kubenswrapper[4962]: I0220 11:05:38.621436 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerStarted","Data":"9d52fba61158f0e1cb7806757e0716b4eac9df35c05b2133602134586966576a"} Feb 20 11:05:38 crc kubenswrapper[4962]: I0220 11:05:38.649131 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzxjg" podStartSLOduration=2.172155068 podStartE2EDuration="7.649110077s" podCreationTimestamp="2026-02-20 11:05:31 +0000 UTC" firstStartedPulling="2026-02-20 11:05:32.564208867 +0000 UTC m=+4224.146680753" lastFinishedPulling="2026-02-20 11:05:38.041163876 +0000 UTC m=+4229.623635762" observedRunningTime="2026-02-20 11:05:38.647433566 +0000 UTC m=+4230.229905432" watchObservedRunningTime="2026-02-20 11:05:38.649110077 +0000 UTC m=+4230.231581943" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.507717 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.508668 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.721051 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.721120 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.790332 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.790540 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.904498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.966707 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.967084 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7zlm" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" containerID="cri-o://89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab" gracePeriod=2 Feb 20 11:05:52 crc kubenswrapper[4962]: I0220 11:05:52.758037 4962 generic.go:334] "Generic (PLEG): container finished" podID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerID="89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab" exitCode=0 Feb 20 11:05:52 crc kubenswrapper[4962]: I0220 11:05:52.758729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab"} Feb 20 
11:05:52 crc kubenswrapper[4962]: I0220 11:05:52.859428 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.059911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.059972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.060040 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.060606 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities" (OuterVolumeSpecName: "utilities") pod "a63e8904-d4b9-405f-94a1-f44cb565b3e7" (UID: "a63e8904-d4b9-405f-94a1-f44cb565b3e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.066918 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr" (OuterVolumeSpecName: "kube-api-access-66gmr") pod "a63e8904-d4b9-405f-94a1-f44cb565b3e7" (UID: "a63e8904-d4b9-405f-94a1-f44cb565b3e7"). InnerVolumeSpecName "kube-api-access-66gmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.102926 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a63e8904-d4b9-405f-94a1-f44cb565b3e7" (UID: "a63e8904-d4b9-405f-94a1-f44cb565b3e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.161828 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") on node \"crc\" DevicePath \"\"" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.162122 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.162213 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.776513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" 
event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70"} Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.776636 4962 scope.go:117] "RemoveContainer" containerID="89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.776870 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.814321 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.822330 4962 scope.go:117] "RemoveContainer" containerID="4a3a18f226977c3365d615b122597c40567fbf4037342852763a1652d9c44e94" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.847034 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.866887 4962 scope.go:117] "RemoveContainer" containerID="0dadcfbc9540e03300cad39a3785cea06d52c326b71fdd2b10314f009df918de" Feb 20 11:05:55 crc kubenswrapper[4962]: I0220 11:05:55.158466 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" path="/var/lib/kubelet/pods/a63e8904-d4b9-405f-94a1-f44cb565b3e7/volumes" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.507860 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.508714 4962 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.508802 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.509894 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.510004 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" gracePeriod=600 Feb 20 11:06:11 crc kubenswrapper[4962]: E0220 11:06:11.783839 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.962976 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" 
containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" exitCode=0 Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.963046 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b"} Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.963097 4962 scope.go:117] "RemoveContainer" containerID="b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.963986 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:11 crc kubenswrapper[4962]: E0220 11:06:11.964452 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:16 crc kubenswrapper[4962]: I0220 11:06:16.863413 4962 scope.go:117] "RemoveContainer" containerID="1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6" Feb 20 11:06:16 crc kubenswrapper[4962]: I0220 11:06:16.890871 4962 scope.go:117] "RemoveContainer" containerID="9c0af593b4b43f8781c7a5aba922acad47038b25f5521623f3ea1762a03d3532" Feb 20 11:06:16 crc kubenswrapper[4962]: I0220 11:06:16.918039 4962 scope.go:117] "RemoveContainer" containerID="02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e" Feb 20 11:06:23 crc kubenswrapper[4962]: I0220 11:06:23.138860 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 
11:06:23 crc kubenswrapper[4962]: E0220 11:06:23.139689 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:34 crc kubenswrapper[4962]: I0220 11:06:34.139115 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:34 crc kubenswrapper[4962]: E0220 11:06:34.141048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:46 crc kubenswrapper[4962]: I0220 11:06:46.139853 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:46 crc kubenswrapper[4962]: E0220 11:06:46.140840 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:01 crc kubenswrapper[4962]: I0220 11:07:01.139994 4962 scope.go:117] "RemoveContainer" 
containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:01 crc kubenswrapper[4962]: E0220 11:07:01.141332 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:16 crc kubenswrapper[4962]: I0220 11:07:16.138864 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:16 crc kubenswrapper[4962]: E0220 11:07:16.141038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:31 crc kubenswrapper[4962]: I0220 11:07:31.138953 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:31 crc kubenswrapper[4962]: E0220 11:07:31.139926 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:43 crc kubenswrapper[4962]: I0220 11:07:43.139800 4962 scope.go:117] 
"RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:43 crc kubenswrapper[4962]: E0220 11:07:43.141782 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.871162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:07:52 crc kubenswrapper[4962]: E0220 11:07:52.872125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-utilities" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872140 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-utilities" Feb 20 11:07:52 crc kubenswrapper[4962]: E0220 11:07:52.872160 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872170 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" Feb 20 11:07:52 crc kubenswrapper[4962]: E0220 11:07:52.872181 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-content" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872189 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-content" Feb 20 11:07:52 crc kubenswrapper[4962]: 
I0220 11:07:52.872365 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.873644 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.886154 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.059785 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.059849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.059874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.160699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.160758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.160781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.161274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.161529 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.184490 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5w7n\" (UniqueName: 
\"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.195404 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.478134 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.869353 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" exitCode=0 Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.869480 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f"} Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.871556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerStarted","Data":"b9dc7d0ae4328e3eafdc7473677bb84d1e13ba25cac490a152209bb921d94a90"} Feb 20 11:07:54 crc kubenswrapper[4962]: I0220 11:07:54.138943 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:54 crc kubenswrapper[4962]: E0220 11:07:54.139245 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:54 crc kubenswrapper[4962]: I0220 11:07:54.883471 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerStarted","Data":"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b"} Feb 20 11:07:55 crc kubenswrapper[4962]: I0220 11:07:55.895550 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" exitCode=0 Feb 20 11:07:55 crc kubenswrapper[4962]: I0220 11:07:55.895633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b"} Feb 20 11:07:57 crc kubenswrapper[4962]: I0220 11:07:57.915066 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerStarted","Data":"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd"} Feb 20 11:07:57 crc kubenswrapper[4962]: I0220 11:07:57.946400 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-78slw" podStartSLOduration=3.480047705 podStartE2EDuration="5.946381481s" podCreationTimestamp="2026-02-20 11:07:52 +0000 UTC" firstStartedPulling="2026-02-20 11:07:53.871345565 +0000 UTC m=+4365.453817421" lastFinishedPulling="2026-02-20 11:07:56.337679311 +0000 UTC m=+4367.920151197" observedRunningTime="2026-02-20 11:07:57.937731837 +0000 UTC m=+4369.520203723" 
watchObservedRunningTime="2026-02-20 11:07:57.946381481 +0000 UTC m=+4369.528853337" Feb 20 11:08:03 crc kubenswrapper[4962]: I0220 11:08:03.196537 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:03 crc kubenswrapper[4962]: I0220 11:08:03.197182 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:03 crc kubenswrapper[4962]: I0220 11:08:03.271898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:04 crc kubenswrapper[4962]: I0220 11:08:04.040096 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:04 crc kubenswrapper[4962]: I0220 11:08:04.104801 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:08:05 crc kubenswrapper[4962]: I0220 11:08:05.139463 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:05 crc kubenswrapper[4962]: E0220 11:08:05.140087 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.000374 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-78slw" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" 
containerID="cri-o://1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" gracePeriod=2 Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.411405 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.582124 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"b8f75fa9-60ea-40ba-861c-78dbce63f152\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.582465 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"b8f75fa9-60ea-40ba-861c-78dbce63f152\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.582628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"b8f75fa9-60ea-40ba-861c-78dbce63f152\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.584090 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities" (OuterVolumeSpecName: "utilities") pod "b8f75fa9-60ea-40ba-861c-78dbce63f152" (UID: "b8f75fa9-60ea-40ba-861c-78dbce63f152"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.590230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n" (OuterVolumeSpecName: "kube-api-access-m5w7n") pod "b8f75fa9-60ea-40ba-861c-78dbce63f152" (UID: "b8f75fa9-60ea-40ba-861c-78dbce63f152"). InnerVolumeSpecName "kube-api-access-m5w7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.604078 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8f75fa9-60ea-40ba-861c-78dbce63f152" (UID: "b8f75fa9-60ea-40ba-861c-78dbce63f152"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.684159 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.684184 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.684197 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014133 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f75fa9-60ea-40ba-861c-78dbce63f152" 
containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" exitCode=0 Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd"} Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"b9dc7d0ae4328e3eafdc7473677bb84d1e13ba25cac490a152209bb921d94a90"} Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014246 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014270 4962 scope.go:117] "RemoveContainer" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.045644 4962 scope.go:117] "RemoveContainer" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.071544 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.083453 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.085829 4962 scope.go:117] "RemoveContainer" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.125061 4962 scope.go:117] "RemoveContainer" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" Feb 20 
11:08:07 crc kubenswrapper[4962]: E0220 11:08:07.125776 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd\": container with ID starting with 1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd not found: ID does not exist" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.125860 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd"} err="failed to get container status \"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd\": rpc error: code = NotFound desc = could not find container \"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd\": container with ID starting with 1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd not found: ID does not exist" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.125904 4962 scope.go:117] "RemoveContainer" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" Feb 20 11:08:07 crc kubenswrapper[4962]: E0220 11:08:07.126643 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b\": container with ID starting with db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b not found: ID does not exist" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.126702 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b"} err="failed to get container status 
\"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b\": rpc error: code = NotFound desc = could not find container \"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b\": container with ID starting with db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b not found: ID does not exist" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.126743 4962 scope.go:117] "RemoveContainer" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" Feb 20 11:08:07 crc kubenswrapper[4962]: E0220 11:08:07.127212 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f\": container with ID starting with 09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f not found: ID does not exist" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.127280 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f"} err="failed to get container status \"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f\": rpc error: code = NotFound desc = could not find container \"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f\": container with ID starting with 09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f not found: ID does not exist" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.152242 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" path="/var/lib/kubelet/pods/b8f75fa9-60ea-40ba-861c-78dbce63f152/volumes" Feb 20 11:08:19 crc kubenswrapper[4962]: I0220 11:08:19.148663 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 
11:08:19 crc kubenswrapper[4962]: E0220 11:08:19.150126 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:31 crc kubenswrapper[4962]: I0220 11:08:31.138896 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:31 crc kubenswrapper[4962]: E0220 11:08:31.141436 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.330897 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.340589 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496080 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:36 crc kubenswrapper[4962]: E0220 11:08:36.496382 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-utilities" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496397 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" 
containerName="extract-utilities" Feb 20 11:08:36 crc kubenswrapper[4962]: E0220 11:08:36.496424 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496434 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" Feb 20 11:08:36 crc kubenswrapper[4962]: E0220 11:08:36.496464 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-content" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496473 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-content" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496650 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.497186 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.499768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.500766 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bwxwq" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.505816 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.506025 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.517454 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.597810 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.597915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.597984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"crc-storage-crc-jqlcs\" (UID: 
\"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.699932 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.700058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.700179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.700414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.701035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.719826 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.862093 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:37 crc kubenswrapper[4962]: I0220 11:08:37.129950 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:37 crc kubenswrapper[4962]: I0220 11:08:37.153879 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6423ea5e-20ed-4977-a842-2bc521939341" path="/var/lib/kubelet/pods/6423ea5e-20ed-4977-a842-2bc521939341/volumes" Feb 20 11:08:37 crc kubenswrapper[4962]: I0220 11:08:37.272820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jqlcs" event={"ID":"661e27e5-1795-405b-af57-a6f0901b654e","Type":"ContainerStarted","Data":"8b05c1d5829bd42422a14fb650178b9eaa0bcb462ebc9138f3f91ed1ce433170"} Feb 20 11:08:38 crc kubenswrapper[4962]: I0220 11:08:38.282813 4962 generic.go:334] "Generic (PLEG): container finished" podID="661e27e5-1795-405b-af57-a6f0901b654e" containerID="0fe78591e142b00ce0a1305d692098cbd58316c27a1f913e16fe16879c83db51" exitCode=0 Feb 20 11:08:38 crc kubenswrapper[4962]: I0220 11:08:38.282893 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jqlcs" event={"ID":"661e27e5-1795-405b-af57-a6f0901b654e","Type":"ContainerDied","Data":"0fe78591e142b00ce0a1305d692098cbd58316c27a1f913e16fe16879c83db51"} Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.664812 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.751254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"661e27e5-1795-405b-af57-a6f0901b654e\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.751726 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"661e27e5-1795-405b-af57-a6f0901b654e\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.751750 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"661e27e5-1795-405b-af57-a6f0901b654e\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.752016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "661e27e5-1795-405b-af57-a6f0901b654e" (UID: "661e27e5-1795-405b-af57-a6f0901b654e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.758875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9" (OuterVolumeSpecName: "kube-api-access-wmtm9") pod "661e27e5-1795-405b-af57-a6f0901b654e" (UID: "661e27e5-1795-405b-af57-a6f0901b654e"). InnerVolumeSpecName "kube-api-access-wmtm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.770277 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "661e27e5-1795-405b-af57-a6f0901b654e" (UID: "661e27e5-1795-405b-af57-a6f0901b654e"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.853155 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.853218 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.853239 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:40 crc kubenswrapper[4962]: I0220 11:08:40.304405 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jqlcs" event={"ID":"661e27e5-1795-405b-af57-a6f0901b654e","Type":"ContainerDied","Data":"8b05c1d5829bd42422a14fb650178b9eaa0bcb462ebc9138f3f91ed1ce433170"} Feb 20 11:08:40 crc kubenswrapper[4962]: I0220 11:08:40.304467 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b05c1d5829bd42422a14fb650178b9eaa0bcb462ebc9138f3f91ed1ce433170" Feb 20 11:08:40 crc kubenswrapper[4962]: I0220 11:08:40.304508 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.059832 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.070363 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.139102 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:42 crc kubenswrapper[4962]: E0220 11:08:42.139665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.227827 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ld4sw"] Feb 20 11:08:42 crc kubenswrapper[4962]: E0220 11:08:42.228133 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661e27e5-1795-405b-af57-a6f0901b654e" containerName="storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.228147 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="661e27e5-1795-405b-af57-a6f0901b654e" containerName="storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.228309 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="661e27e5-1795-405b-af57-a6f0901b654e" containerName="storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.228807 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.232054 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.232285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bwxwq" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.233828 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.233995 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.238283 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ld4sw"] Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.400671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.400735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.400762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"crc-storage-crc-ld4sw\" (UID: 
\"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.501707 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.501948 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.502001 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.502441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.503019 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.530727 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.545937 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.858312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ld4sw"] Feb 20 11:08:43 crc kubenswrapper[4962]: I0220 11:08:43.153844 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="661e27e5-1795-405b-af57-a6f0901b654e" path="/var/lib/kubelet/pods/661e27e5-1795-405b-af57-a6f0901b654e/volumes" Feb 20 11:08:43 crc kubenswrapper[4962]: I0220 11:08:43.329513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld4sw" event={"ID":"b357a94b-d688-4db2-9693-581cf3d3a650","Type":"ContainerStarted","Data":"19b7abe0d5962436e029d44248d9735517a68c8b28538521d0159ccd5a176372"} Feb 20 11:08:44 crc kubenswrapper[4962]: I0220 11:08:44.341484 4962 generic.go:334] "Generic (PLEG): container finished" podID="b357a94b-d688-4db2-9693-581cf3d3a650" containerID="a3e551b0efed20f0f0350f061d20e398a0e2cf1b98045b42cb931ca25aaedfe9" exitCode=0 Feb 20 11:08:44 crc kubenswrapper[4962]: I0220 11:08:44.341575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld4sw" event={"ID":"b357a94b-d688-4db2-9693-581cf3d3a650","Type":"ContainerDied","Data":"a3e551b0efed20f0f0350f061d20e398a0e2cf1b98045b42cb931ca25aaedfe9"} Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.742303 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.786416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"b357a94b-d688-4db2-9693-581cf3d3a650\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.786526 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"b357a94b-d688-4db2-9693-581cf3d3a650\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.786568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"b357a94b-d688-4db2-9693-581cf3d3a650\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.787220 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b357a94b-d688-4db2-9693-581cf3d3a650" (UID: "b357a94b-d688-4db2-9693-581cf3d3a650"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.794801 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96" (OuterVolumeSpecName: "kube-api-access-spx96") pod "b357a94b-d688-4db2-9693-581cf3d3a650" (UID: "b357a94b-d688-4db2-9693-581cf3d3a650"). InnerVolumeSpecName "kube-api-access-spx96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.809541 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b357a94b-d688-4db2-9693-581cf3d3a650" (UID: "b357a94b-d688-4db2-9693-581cf3d3a650"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.889001 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.889046 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.889071 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:46 crc kubenswrapper[4962]: I0220 11:08:46.362968 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld4sw" event={"ID":"b357a94b-d688-4db2-9693-581cf3d3a650","Type":"ContainerDied","Data":"19b7abe0d5962436e029d44248d9735517a68c8b28538521d0159ccd5a176372"} Feb 20 11:08:46 crc kubenswrapper[4962]: I0220 11:08:46.363025 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b7abe0d5962436e029d44248d9735517a68c8b28538521d0159ccd5a176372" Feb 20 11:08:46 crc kubenswrapper[4962]: I0220 11:08:46.363064 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:53 crc kubenswrapper[4962]: I0220 11:08:53.148127 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:53 crc kubenswrapper[4962]: E0220 11:08:53.149238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:05 crc kubenswrapper[4962]: I0220 11:09:05.139490 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:05 crc kubenswrapper[4962]: E0220 11:09:05.140851 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:17 crc kubenswrapper[4962]: I0220 11:09:17.056101 4962 scope.go:117] "RemoveContainer" containerID="dd81866a8883595a9a43e5321d2a1e397058906782cb6839d22126aa9d907feb" Feb 20 11:09:17 crc kubenswrapper[4962]: I0220 11:09:17.140363 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:17 crc kubenswrapper[4962]: E0220 11:09:17.140742 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:32 crc kubenswrapper[4962]: I0220 11:09:32.138842 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:32 crc kubenswrapper[4962]: E0220 11:09:32.139873 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:43 crc kubenswrapper[4962]: I0220 11:09:43.138935 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:43 crc kubenswrapper[4962]: E0220 11:09:43.140019 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:56 crc kubenswrapper[4962]: I0220 11:09:56.140458 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:56 crc kubenswrapper[4962]: E0220 11:09:56.143848 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:11 crc kubenswrapper[4962]: I0220 11:10:11.140127 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:11 crc kubenswrapper[4962]: E0220 11:10:11.141145 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:22 crc kubenswrapper[4962]: I0220 11:10:22.139013 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:22 crc kubenswrapper[4962]: E0220 11:10:22.139843 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:34 crc kubenswrapper[4962]: I0220 11:10:34.139740 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:34 crc kubenswrapper[4962]: E0220 11:10:34.141088 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:45 crc kubenswrapper[4962]: I0220 11:10:45.139498 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:45 crc kubenswrapper[4962]: E0220 11:10:45.140496 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:11:00 crc kubenswrapper[4962]: I0220 11:11:00.140010 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:11:00 crc kubenswrapper[4962]: E0220 11:11:00.141088 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:11:13 crc kubenswrapper[4962]: I0220 11:11:13.139832 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:11:13 crc kubenswrapper[4962]: I0220 11:11:13.723247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"} Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.086028 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:33 crc kubenswrapper[4962]: E0220 11:11:33.089816 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357a94b-d688-4db2-9693-581cf3d3a650" containerName="storage" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.089855 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357a94b-d688-4db2-9693-581cf3d3a650" containerName="storage" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.090153 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357a94b-d688-4db2-9693-581cf3d3a650" containerName="storage" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.091978 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.103111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.260165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.260258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.260373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.362198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.362823 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.363543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.363157 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.362859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.389195 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.427311 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.678402 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:33 crc kubenswrapper[4962]: W0220 11:11:33.682844 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8123f297_3029_4d9d_a922_2b771aed43c0.slice/crio-f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480 WatchSource:0}: Error finding container f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480: Status 404 returned error can't find the container with id f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480 Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.895844 4962 generic.go:334] "Generic (PLEG): container finished" podID="8123f297-3029-4d9d-a922-2b771aed43c0" containerID="feabc8c4cf26db6fa376158de7d8e770dde59ef8215527032dd3ebcbccf03e3a" exitCode=0 Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.895891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"feabc8c4cf26db6fa376158de7d8e770dde59ef8215527032dd3ebcbccf03e3a"} Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.895914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerStarted","Data":"f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480"} Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.897500 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 11:11:34 crc kubenswrapper[4962]: I0220 11:11:34.907638 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerStarted","Data":"236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31"} Feb 20 11:11:35 crc kubenswrapper[4962]: I0220 11:11:35.920396 4962 generic.go:334] "Generic (PLEG): container finished" podID="8123f297-3029-4d9d-a922-2b771aed43c0" containerID="236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31" exitCode=0 Feb 20 11:11:35 crc kubenswrapper[4962]: I0220 11:11:35.920470 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31"} Feb 20 11:11:36 crc kubenswrapper[4962]: I0220 11:11:36.934422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerStarted","Data":"46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d"} Feb 20 11:11:36 crc kubenswrapper[4962]: I0220 11:11:36.958400 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gf65t" podStartSLOduration=1.5143869250000002 podStartE2EDuration="3.958377457s" podCreationTimestamp="2026-02-20 11:11:33 +0000 UTC" firstStartedPulling="2026-02-20 11:11:33.897294193 +0000 UTC m=+4585.479766039" lastFinishedPulling="2026-02-20 11:11:36.341284685 +0000 UTC m=+4587.923756571" observedRunningTime="2026-02-20 11:11:36.954085586 +0000 UTC m=+4588.536557462" watchObservedRunningTime="2026-02-20 11:11:36.958377457 +0000 UTC m=+4588.540849343" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.479848 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.481949 4962 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.497938 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.633159 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.633463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.633668 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.735629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.735746 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.735797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.736460 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.736790 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.776516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.798037 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.112690 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.957038 4962 generic.go:334] "Generic (PLEG): container finished" podID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerID="9ee74111c1ba86e5709005fcbe78e4bf5aa89be27bca01ef0b469ef1b5c60efd" exitCode=0 Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.957143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"9ee74111c1ba86e5709005fcbe78e4bf5aa89be27bca01ef0b469ef1b5c60efd"} Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.957340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerStarted","Data":"044ddc12238627b5579928a6941e216bc98eccf5658fc5ebe3202f9d66cede0e"} Feb 20 11:11:39 crc kubenswrapper[4962]: I0220 11:11:39.966764 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerStarted","Data":"a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa"} Feb 20 11:11:40 crc kubenswrapper[4962]: I0220 11:11:40.976424 4962 generic.go:334] "Generic (PLEG): container finished" podID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerID="a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa" exitCode=0 Feb 20 11:11:40 crc kubenswrapper[4962]: I0220 11:11:40.976489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" 
event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa"} Feb 20 11:11:41 crc kubenswrapper[4962]: I0220 11:11:41.990076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerStarted","Data":"4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0"} Feb 20 11:11:42 crc kubenswrapper[4962]: I0220 11:11:42.023735 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2k4b" podStartSLOduration=2.422794915 podStartE2EDuration="5.023710711s" podCreationTimestamp="2026-02-20 11:11:37 +0000 UTC" firstStartedPulling="2026-02-20 11:11:38.959951134 +0000 UTC m=+4590.542422980" lastFinishedPulling="2026-02-20 11:11:41.56086692 +0000 UTC m=+4593.143338776" observedRunningTime="2026-02-20 11:11:42.015423427 +0000 UTC m=+4593.597895313" watchObservedRunningTime="2026-02-20 11:11:42.023710711 +0000 UTC m=+4593.606182597" Feb 20 11:11:43 crc kubenswrapper[4962]: I0220 11:11:43.428350 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:43 crc kubenswrapper[4962]: I0220 11:11:43.428526 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:44 crc kubenswrapper[4962]: I0220 11:11:44.484904 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gf65t" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" probeResult="failure" output=< Feb 20 11:11:44 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 11:11:44 crc kubenswrapper[4962]: > Feb 20 11:11:47 crc kubenswrapper[4962]: I0220 11:11:47.798878 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:47 crc kubenswrapper[4962]: I0220 11:11:47.799126 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:47 crc kubenswrapper[4962]: I0220 11:11:47.875595 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:48 crc kubenswrapper[4962]: I0220 11:11:48.116316 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:50 crc kubenswrapper[4962]: I0220 11:11:50.665814 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:50 crc kubenswrapper[4962]: I0220 11:11:50.666446 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n2k4b" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" containerID="cri-o://4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0" gracePeriod=2 Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075409 4962 generic.go:334] "Generic (PLEG): container finished" podID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerID="4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0" exitCode=0 Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0"} Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075943 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" 
event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"044ddc12238627b5579928a6941e216bc98eccf5658fc5ebe3202f9d66cede0e"} Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075957 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="044ddc12238627b5579928a6941e216bc98eccf5658fc5ebe3202f9d66cede0e" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.103229 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.256660 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"96f2df87-1a45-4012-9cb8-cfe7722350d6\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.256711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"96f2df87-1a45-4012-9cb8-cfe7722350d6\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.256736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"96f2df87-1a45-4012-9cb8-cfe7722350d6\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.258374 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities" (OuterVolumeSpecName: "utilities") pod "96f2df87-1a45-4012-9cb8-cfe7722350d6" (UID: "96f2df87-1a45-4012-9cb8-cfe7722350d6"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.267022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh" (OuterVolumeSpecName: "kube-api-access-xk4gh") pod "96f2df87-1a45-4012-9cb8-cfe7722350d6" (UID: "96f2df87-1a45-4012-9cb8-cfe7722350d6"). InnerVolumeSpecName "kube-api-access-xk4gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.315173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96f2df87-1a45-4012-9cb8-cfe7722350d6" (UID: "96f2df87-1a45-4012-9cb8-cfe7722350d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.358045 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.358089 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.358108 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.085568 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.152084 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.159748 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.505615 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.553020 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:55 crc kubenswrapper[4962]: I0220 11:11:55.149725 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" path="/var/lib/kubelet/pods/96f2df87-1a45-4012-9cb8-cfe7722350d6/volumes" Feb 20 11:11:55 crc kubenswrapper[4962]: I0220 11:11:55.868361 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:55 crc kubenswrapper[4962]: I0220 11:11:55.868793 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gf65t" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" containerID="cri-o://46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d" gracePeriod=2 Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.115037 4962 generic.go:334] "Generic (PLEG): container finished" podID="8123f297-3029-4d9d-a922-2b771aed43c0" containerID="46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d" exitCode=0 Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.115105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d"} Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.400258 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.520992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"8123f297-3029-4d9d-a922-2b771aed43c0\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.521484 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"8123f297-3029-4d9d-a922-2b771aed43c0\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.521796 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"8123f297-3029-4d9d-a922-2b771aed43c0\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.532186 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p" (OuterVolumeSpecName: "kube-api-access-g7w7p") pod "8123f297-3029-4d9d-a922-2b771aed43c0" (UID: "8123f297-3029-4d9d-a922-2b771aed43c0"). InnerVolumeSpecName "kube-api-access-g7w7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.532915 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities" (OuterVolumeSpecName: "utilities") pod "8123f297-3029-4d9d-a922-2b771aed43c0" (UID: "8123f297-3029-4d9d-a922-2b771aed43c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.623803 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.623879 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.670022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8123f297-3029-4d9d-a922-2b771aed43c0" (UID: "8123f297-3029-4d9d-a922-2b771aed43c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.725380 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.127566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480"} Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.127648 4962 scope.go:117] "RemoveContainer" containerID="46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.128236 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.154626 4962 scope.go:117] "RemoveContainer" containerID="236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.182014 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.192678 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.208398 4962 scope.go:117] "RemoveContainer" containerID="feabc8c4cf26db6fa376158de7d8e770dde59ef8215527032dd3ebcbccf03e3a" Feb 20 11:11:59 crc kubenswrapper[4962]: I0220 11:11:59.156967 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" path="/var/lib/kubelet/pods/8123f297-3029-4d9d-a922-2b771aed43c0/volumes" Feb 20 11:12:06 crc 
kubenswrapper[4962]: I0220 11:12:06.978001 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978495 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978507 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978520 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978526 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978539 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978545 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978560 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978572 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 
11:12:06.978579 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978601 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978607 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978723 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978741 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.979380 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.981371 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.981467 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7q92k" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.981468 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.982296 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.983097 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.014497 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-62bkq"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.016169 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.025954 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-62bkq"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.039529 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081723 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6zx\" (UniqueName: \"kubernetes.io/projected/10021bed-f80b-491c-8326-88df1a07c1f7-kube-api-access-ch6zx\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081781 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-dns-svc\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " 
pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-config\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6zx\" (UniqueName: \"kubernetes.io/projected/10021bed-f80b-491c-8326-88df1a07c1f7-kube-api-access-ch6zx\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc 
kubenswrapper[4962]: I0220 11:12:07.182914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-dns-svc\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182993 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-config\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.183681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.183838 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-config\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.184047 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.184741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-dns-svc\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.219771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6zx\" (UniqueName: \"kubernetes.io/projected/10021bed-f80b-491c-8326-88df1a07c1f7-kube-api-access-ch6zx\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.226341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.294908 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.335138 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.721054 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.787156 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-62bkq"] Feb 20 11:12:07 crc kubenswrapper[4962]: W0220 11:12:07.797123 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10021bed_f80b_491c_8326_88df1a07c1f7.slice/crio-e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1 WatchSource:0}: Error finding container e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1: Status 404 returned error can't find the container with id e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1 Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.136137 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.137923 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.140681 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.140692 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.141367 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.141920 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gjq7z" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.143682 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.164674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhgp7\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-kube-api-access-mhgp7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203190 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203344 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203427 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc 
kubenswrapper[4962]: I0220 11:12:08.203622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203653 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.239269 4962 generic.go:334] "Generic (PLEG): container finished" podID="a26573eb-419d-4ead-b747-2cc004252564" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" exitCode=0 Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.239337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerDied","Data":"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.239577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerStarted","Data":"588013732c3631c15df27205354a3f5d50e66c6de81609620c23fca9c83b06f2"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.241491 4962 generic.go:334] "Generic (PLEG): container finished" podID="10021bed-f80b-491c-8326-88df1a07c1f7" containerID="d7bfd7bc38a8dbbd8ca5b64b18b808624fe45d14c18059530aa5d5534f381ec4" exitCode=0 Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.241532 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" event={"ID":"10021bed-f80b-491c-8326-88df1a07c1f7","Type":"ContainerDied","Data":"d7bfd7bc38a8dbbd8ca5b64b18b808624fe45d14c18059530aa5d5534f381ec4"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.241570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" event={"ID":"10021bed-f80b-491c-8326-88df1a07c1f7","Type":"ContainerStarted","Data":"e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305046 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhgp7\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-kube-api-access-mhgp7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305333 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305639 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.309198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.309484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.310010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.310126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.314521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.316219 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.318324 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.318370 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/07545b54426f40b00bb4c13cc1f9fe59b7b4a09fd52fcbea77ec3ff6291e7b54/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.322311 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhgp7\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-kube-api-access-mhgp7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.323235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.371191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.467905 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.897404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 11:12:08 crc kubenswrapper[4962]: W0220 11:12:08.904747 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1374d6_d1c8_4b28_a524_485ced8ec7b9.slice/crio-e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136 WatchSource:0}: Error finding container e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136: Status 404 returned error can't find the container with id e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136 Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.253155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerStarted","Data":"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa"} Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.255617 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.255814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" event={"ID":"10021bed-f80b-491c-8326-88df1a07c1f7","Type":"ContainerStarted","Data":"afbd6f1bea126f3c4967c55326ee4f10ece6c77611d27a88e91672c7cb7e01b8"} Feb 20 11:12:09 crc 
kubenswrapper[4962]: I0220 11:12:09.256029 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.259263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerStarted","Data":"e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136"} Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.272332 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" podStartSLOduration=3.272311217 podStartE2EDuration="3.272311217s" podCreationTimestamp="2026-02-20 11:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:09.270817561 +0000 UTC m=+4620.853289447" watchObservedRunningTime="2026-02-20 11:12:09.272311217 +0000 UTC m=+4620.854783063" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.289035 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" podStartSLOduration=3.289019749 podStartE2EDuration="3.289019749s" podCreationTimestamp="2026-02-20 11:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:09.287420839 +0000 UTC m=+4620.869892685" watchObservedRunningTime="2026-02-20 11:12:09.289019749 +0000 UTC m=+4620.871491595" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.481189 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.483130 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.487354 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-s7pjk" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.487419 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.488647 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.490298 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.505270 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.511162 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642772 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642822 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hgx9s\" (UniqueName: \"kubernetes.io/projected/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kube-api-access-hgx9s\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642983 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.643019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.643054 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.745891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746417 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746439 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgx9s\" (UniqueName: \"kubernetes.io/projected/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kube-api-access-hgx9s\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749849 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.751459 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.751489 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/82af7e16ae6a26f5ab26562d44f8c6d5bb94bc45afb9982e337b9181f1f053f8/globalmount\"" pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.766997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 
11:12:09.772820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.788743 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgx9s\" (UniqueName: \"kubernetes.io/projected/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kube-api-access-hgx9s\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.793765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.815545 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.070664 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.082326 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.084622 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.085105 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q28bt" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.090448 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.200181 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.252815 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kolla-config\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.252851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gn5\" (UniqueName: \"kubernetes.io/projected/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kube-api-access-m4gn5\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.252876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-config-data\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.268201 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerStarted","Data":"185492e6f361b7a6ca14d565e54e7d3405c3d4ed383de858a246ebc0de2df704"} Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.356130 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kolla-config\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.356249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gn5\" (UniqueName: \"kubernetes.io/projected/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kube-api-access-m4gn5\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.356291 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-config-data\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.358128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-config-data\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.358485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kolla-config\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.373083 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gn5\" (UniqueName: \"kubernetes.io/projected/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kube-api-access-m4gn5\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.418173 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.865947 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: W0220 11:12:10.867990 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8374d0f9_f4be_4f6b_88eb_4849a2be49e9.slice/crio-87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747 WatchSource:0}: Error finding container 87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747: Status 404 returned error can't find the container with id 87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747 Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.094070 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.095490 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100332 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100546 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100725 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100922 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mpsfv" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.108514 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.269440 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270219 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtvz\" (UniqueName: 
\"kubernetes.io/projected/97ae547e-e977-4b15-a979-38415ee77885-kube-api-access-7xtvz\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270253 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ae547e-e977-4b15-a979-38415ee77885-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270435 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.278750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerStarted","Data":"ea699931b3b6d1154f7563dc0e7c455f7597120eb0a07bf4c657255d11998dcb"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.280510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerStarted","Data":"2b549c0faa264bbd60d3323fce20c7bbdb66db97b9e039a78354cbddfca9fde0"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.284490 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8374d0f9-f4be-4f6b-88eb-4849a2be49e9","Type":"ContainerStarted","Data":"f9580210b6730fa5555dcbac945cd9238c8dea64942334e413e20cbfb558a662"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.284543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8374d0f9-f4be-4f6b-88eb-4849a2be49e9","Type":"ContainerStarted","Data":"87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.285168 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.351018 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.350992115 podStartE2EDuration="1.350992115s" podCreationTimestamp="2026-02-20 11:12:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:11.344259409 +0000 UTC m=+4622.926731255" watchObservedRunningTime="2026-02-20 11:12:11.350992115 +0000 UTC m=+4622.933463961" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtvz\" (UniqueName: \"kubernetes.io/projected/97ae547e-e977-4b15-a979-38415ee77885-kube-api-access-7xtvz\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372485 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/97ae547e-e977-4b15-a979-38415ee77885-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372559 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372688 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.377242 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.377399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/97ae547e-e977-4b15-a979-38415ee77885-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.377429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.378171 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.379883 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.380256 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9baade218bdce1757add9c2b3a768cfc65cf332cf4ef6807977bf89c1521c62b/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.384398 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.385251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.410281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtvz\" (UniqueName: \"kubernetes.io/projected/97ae547e-e977-4b15-a979-38415ee77885-kube-api-access-7xtvz\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.414528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.712867 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:12 crc kubenswrapper[4962]: I0220 11:12:12.184648 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 11:12:12 crc kubenswrapper[4962]: I0220 11:12:12.298398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerStarted","Data":"6411c1a34832bdf05020f42bd380f9bcf0fb50e356efafde3c7279008c8478c1"} Feb 20 11:12:13 crc kubenswrapper[4962]: I0220 11:12:13.309286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerStarted","Data":"12ab0d1167bdad63132aac820b88ba53b430b6ab81fb79e02b3560ce10a49d27"} Feb 20 11:12:14 crc kubenswrapper[4962]: I0220 11:12:14.326642 4962 generic.go:334] "Generic (PLEG): container finished" podID="f2ffa3bc-ffbe-4a42-b14f-48aa20546210" containerID="2b549c0faa264bbd60d3323fce20c7bbdb66db97b9e039a78354cbddfca9fde0" exitCode=0 Feb 20 11:12:14 crc kubenswrapper[4962]: I0220 11:12:14.326712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerDied","Data":"2b549c0faa264bbd60d3323fce20c7bbdb66db97b9e039a78354cbddfca9fde0"} Feb 20 11:12:15 crc kubenswrapper[4962]: I0220 11:12:15.340734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerStarted","Data":"71b411cb31dee0e1edef65607970ddb4ab5d4d6ff0b65b7389774037c8b712f7"} Feb 20 11:12:15 crc kubenswrapper[4962]: I0220 11:12:15.382670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.382637163 podStartE2EDuration="7.382637163s" podCreationTimestamp="2026-02-20 11:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:15.374354819 +0000 UTC m=+4626.956826725" watchObservedRunningTime="2026-02-20 11:12:15.382637163 +0000 UTC m=+4626.965109069" Feb 20 11:12:15 crc kubenswrapper[4962]: I0220 11:12:15.420149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 20 11:12:16 crc kubenswrapper[4962]: I0220 11:12:16.351520 4962 generic.go:334] "Generic (PLEG): container finished" podID="97ae547e-e977-4b15-a979-38415ee77885" containerID="12ab0d1167bdad63132aac820b88ba53b430b6ab81fb79e02b3560ce10a49d27" exitCode=0 Feb 20 11:12:16 crc kubenswrapper[4962]: I0220 11:12:16.351652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerDied","Data":"12ab0d1167bdad63132aac820b88ba53b430b6ab81fb79e02b3560ce10a49d27"} Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.296875 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.337284 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.377708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerStarted","Data":"7c8d74176f301a3b06823f2556b09390524ebb2cd8385ab222b25d929d3aae70"} Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.409554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.409810 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" containerID="cri-o://fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" gracePeriod=10 Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.420019 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.419995626 podStartE2EDuration="7.419995626s" podCreationTimestamp="2026-02-20 11:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:17.414801207 +0000 UTC m=+4628.997273093" watchObservedRunningTime="2026-02-20 11:12:17.419995626 +0000 UTC m=+4629.002467482" Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.890070 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.013993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"a26573eb-419d-4ead-b747-2cc004252564\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.014127 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"a26573eb-419d-4ead-b747-2cc004252564\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.014201 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"a26573eb-419d-4ead-b747-2cc004252564\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.024811 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl" (OuterVolumeSpecName: "kube-api-access-rv2pl") pod "a26573eb-419d-4ead-b747-2cc004252564" (UID: "a26573eb-419d-4ead-b747-2cc004252564"). InnerVolumeSpecName "kube-api-access-rv2pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.065197 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a26573eb-419d-4ead-b747-2cc004252564" (UID: "a26573eb-419d-4ead-b747-2cc004252564"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.068444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config" (OuterVolumeSpecName: "config") pod "a26573eb-419d-4ead-b747-2cc004252564" (UID: "a26573eb-419d-4ead-b747-2cc004252564"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.116607 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.116639 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.116649 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.389960 4962 generic.go:334] "Generic (PLEG): container finished" podID="a26573eb-419d-4ead-b747-2cc004252564" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" exitCode=0 Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390050 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerDied","Data":"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa"} Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerDied","Data":"588013732c3631c15df27205354a3f5d50e66c6de81609620c23fca9c83b06f2"} Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390265 4962 scope.go:117] "RemoveContainer" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.421434 4962 scope.go:117] "RemoveContainer" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.443201 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.450782 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.659328 4962 scope.go:117] "RemoveContainer" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" Feb 20 11:12:18 crc kubenswrapper[4962]: E0220 11:12:18.660047 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa\": container with ID starting with fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa not found: ID does not exist" 
containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.660097 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa"} err="failed to get container status \"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa\": rpc error: code = NotFound desc = could not find container \"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa\": container with ID starting with fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa not found: ID does not exist" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.660131 4962 scope.go:117] "RemoveContainer" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" Feb 20 11:12:18 crc kubenswrapper[4962]: E0220 11:12:18.660483 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570\": container with ID starting with 6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570 not found: ID does not exist" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.660520 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570"} err="failed to get container status \"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570\": rpc error: code = NotFound desc = could not find container \"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570\": container with ID starting with 6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570 not found: ID does not exist" Feb 20 11:12:19 crc kubenswrapper[4962]: I0220 11:12:19.152278 4962 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a26573eb-419d-4ead-b747-2cc004252564" path="/var/lib/kubelet/pods/a26573eb-419d-4ead-b747-2cc004252564/volumes" Feb 20 11:12:19 crc kubenswrapper[4962]: I0220 11:12:19.815695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 20 11:12:19 crc kubenswrapper[4962]: I0220 11:12:19.815774 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 20 11:12:21 crc kubenswrapper[4962]: I0220 11:12:21.714833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:21 crc kubenswrapper[4962]: I0220 11:12:21.715214 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:22 crc kubenswrapper[4962]: I0220 11:12:22.237928 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 20 11:12:22 crc kubenswrapper[4962]: I0220 11:12:22.307583 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 20 11:12:24 crc kubenswrapper[4962]: I0220 11:12:24.133876 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:24 crc kubenswrapper[4962]: I0220 11:12:24.248960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.461634 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:28 crc kubenswrapper[4962]: E0220 11:12:28.462565 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="init" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.462610 
4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="init" Feb 20 11:12:28 crc kubenswrapper[4962]: E0220 11:12:28.462621 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.462628 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.462779 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.463521 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.466224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.491747 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.613896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.614295 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"root-account-create-update-bnx24\" (UID: 
\"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.716315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.717137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.718441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.750217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.797425 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:29 crc kubenswrapper[4962]: I0220 11:12:29.322842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:29 crc kubenswrapper[4962]: W0220 11:12:29.589771 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd14b3c9f_d912_4f57_8df9_6b20338707e5.slice/crio-719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786 WatchSource:0}: Error finding container 719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786: Status 404 returned error can't find the container with id 719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786 Feb 20 11:12:30 crc kubenswrapper[4962]: I0220 11:12:30.536017 4962 generic.go:334] "Generic (PLEG): container finished" podID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerID="f56f77a29f18b053790ac8a764373585312883ee763879c9fe012ee4ec5c65e1" exitCode=0 Feb 20 11:12:30 crc kubenswrapper[4962]: I0220 11:12:30.536145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bnx24" event={"ID":"d14b3c9f-d912-4f57-8df9-6b20338707e5","Type":"ContainerDied","Data":"f56f77a29f18b053790ac8a764373585312883ee763879c9fe012ee4ec5c65e1"} Feb 20 11:12:30 crc kubenswrapper[4962]: I0220 11:12:30.536383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bnx24" event={"ID":"d14b3c9f-d912-4f57-8df9-6b20338707e5","Type":"ContainerStarted","Data":"719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786"} Feb 20 11:12:31 crc kubenswrapper[4962]: I0220 11:12:31.936776 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.070643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"d14b3c9f-d912-4f57-8df9-6b20338707e5\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.070765 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"d14b3c9f-d912-4f57-8df9-6b20338707e5\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.071821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d14b3c9f-d912-4f57-8df9-6b20338707e5" (UID: "d14b3c9f-d912-4f57-8df9-6b20338707e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.080981 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n" (OuterVolumeSpecName: "kube-api-access-7kr6n") pod "d14b3c9f-d912-4f57-8df9-6b20338707e5" (UID: "d14b3c9f-d912-4f57-8df9-6b20338707e5"). InnerVolumeSpecName "kube-api-access-7kr6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.171918 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.171946 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.564877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bnx24" event={"ID":"d14b3c9f-d912-4f57-8df9-6b20338707e5","Type":"ContainerDied","Data":"719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786"} Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.564919 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.564949 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:35 crc kubenswrapper[4962]: I0220 11:12:35.060056 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:35 crc kubenswrapper[4962]: I0220 11:12:35.066668 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:35 crc kubenswrapper[4962]: I0220 11:12:35.151180 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" path="/var/lib/kubelet/pods/d14b3c9f-d912-4f57-8df9-6b20338707e5/volumes" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.078048 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:12:40 crc kubenswrapper[4962]: E0220 11:12:40.079123 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerName="mariadb-account-create-update" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.079140 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerName="mariadb-account-create-update" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.079312 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerName="mariadb-account-create-update" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.079964 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.082859 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.090638 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.157953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.158783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.260580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.260767 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"root-account-create-update-2mzpw\" (UID: 
\"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.262057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.297767 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.402857 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.936084 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:12:40 crc kubenswrapper[4962]: W0220 11:12:40.945678 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae4e019_31b7_4826_a5ef_042faba6034d.slice/crio-203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f WatchSource:0}: Error finding container 203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f: Status 404 returned error can't find the container with id 203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f Feb 20 11:12:41 crc kubenswrapper[4962]: I0220 11:12:41.672165 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2mzpw" event={"ID":"7ae4e019-31b7-4826-a5ef-042faba6034d","Type":"ContainerDied","Data":"8b9ab6691837647b0967e622ecfd0e62f3ce7907b1cef344ccbbdd1bcb192e5e"} Feb 20 11:12:41 crc kubenswrapper[4962]: I0220 11:12:41.672202 4962 generic.go:334] "Generic (PLEG): container finished" podID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerID="8b9ab6691837647b0967e622ecfd0e62f3ce7907b1cef344ccbbdd1bcb192e5e" exitCode=0 Feb 20 11:12:41 crc kubenswrapper[4962]: I0220 11:12:41.672685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2mzpw" event={"ID":"7ae4e019-31b7-4826-a5ef-042faba6034d","Type":"ContainerStarted","Data":"203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f"} Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.095965 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.229293 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"7ae4e019-31b7-4826-a5ef-042faba6034d\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.229477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"7ae4e019-31b7-4826-a5ef-042faba6034d\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.230298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ae4e019-31b7-4826-a5ef-042faba6034d" (UID: "7ae4e019-31b7-4826-a5ef-042faba6034d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.235820 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx" (OuterVolumeSpecName: "kube-api-access-5msxx") pod "7ae4e019-31b7-4826-a5ef-042faba6034d" (UID: "7ae4e019-31b7-4826-a5ef-042faba6034d"). InnerVolumeSpecName "kube-api-access-5msxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.331734 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.331782 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.694419 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2mzpw" event={"ID":"7ae4e019-31b7-4826-a5ef-042faba6034d","Type":"ContainerDied","Data":"203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f"} Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.694468 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.694994 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.696974 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f1374d6-d1c8-4b28-a524-485ced8ec7b9" containerID="ea699931b3b6d1154f7563dc0e7c455f7597120eb0a07bf4c657255d11998dcb" exitCode=0 Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.697010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerDied","Data":"ea699931b3b6d1154f7563dc0e7c455f7597120eb0a07bf4c657255d11998dcb"} Feb 20 11:12:44 crc kubenswrapper[4962]: I0220 11:12:44.707463 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerStarted","Data":"601e743ff0ed3c58a890cd7e80ea114736992d584eb37952d8b0e4d680f2e7e7"} Feb 20 11:12:44 crc kubenswrapper[4962]: I0220 11:12:44.708575 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:44 crc kubenswrapper[4962]: I0220 11:12:44.733310 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.733285392 podStartE2EDuration="37.733285392s" podCreationTimestamp="2026-02-20 11:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:44.727221917 +0000 UTC m=+4656.309693763" watchObservedRunningTime="2026-02-20 11:12:44.733285392 +0000 UTC m=+4656.315757278" Feb 20 11:12:58 crc kubenswrapper[4962]: I0220 11:12:58.472909 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:13:41 crc kubenswrapper[4962]: I0220 11:13:41.508667 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:13:41 crc kubenswrapper[4962]: I0220 11:13:41.510761 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:14:11 crc kubenswrapper[4962]: I0220 11:14:11.507884 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:14:11 crc kubenswrapper[4962]: I0220 11:14:11.508718 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.508472 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509096 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509152 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509786 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509873 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26" gracePeriod=600 Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922258 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26" exitCode=0 Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"} Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"} Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922732 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.161950 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"] Feb 20 11:15:00 crc kubenswrapper[4962]: E0220 11:15:00.163266 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerName="mariadb-account-create-update" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.163298 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerName="mariadb-account-create-update" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.165130 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerName="mariadb-account-create-update" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.166795 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.170569 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.172558 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.193060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"] Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.283570 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.284100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.284279 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.385575 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.385733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.385860 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.387219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.395923 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.405841 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.506294 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.779673 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"]
Feb 20 11:15:01 crc kubenswrapper[4962]: I0220 11:15:01.090585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerStarted","Data":"03c92e9b0e9071dee5a479d5565338cdd2b211bcbc7dfe472fff5850387bc236"}
Feb 20 11:15:01 crc kubenswrapper[4962]: I0220 11:15:01.091014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerStarted","Data":"b4791a388c03bb6e76b408164a65c6e1448fd3bc7edcd44a80b231d64bffb4fa"}
Feb 20 11:15:02 crc kubenswrapper[4962]: I0220 11:15:02.112960 4962 generic.go:334] "Generic (PLEG): container finished" podID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerID="03c92e9b0e9071dee5a479d5565338cdd2b211bcbc7dfe472fff5850387bc236" exitCode=0
Feb 20 11:15:02 crc kubenswrapper[4962]: I0220 11:15:02.113034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerDied","Data":"03c92e9b0e9071dee5a479d5565338cdd2b211bcbc7dfe472fff5850387bc236"}
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.484167 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.637133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") "
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.637343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") "
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.637468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") "
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.638382 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume" (OuterVolumeSpecName: "config-volume") pod "77a85f36-7811-4d2e-86d2-84c9a8aa1e54" (UID: "77a85f36-7811-4d2e-86d2-84c9a8aa1e54"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.643758 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77a85f36-7811-4d2e-86d2-84c9a8aa1e54" (UID: "77a85f36-7811-4d2e-86d2-84c9a8aa1e54"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.645622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc" (OuterVolumeSpecName: "kube-api-access-dn5bc") pod "77a85f36-7811-4d2e-86d2-84c9a8aa1e54" (UID: "77a85f36-7811-4d2e-86d2-84c9a8aa1e54"). InnerVolumeSpecName "kube-api-access-dn5bc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.740016 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") on node \"crc\" DevicePath \"\""
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.740072 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") on node \"crc\" DevicePath \"\""
Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.740092 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.136149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerDied","Data":"b4791a388c03bb6e76b408164a65c6e1448fd3bc7edcd44a80b231d64bffb4fa"}
Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.136193 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4791a388c03bb6e76b408164a65c6e1448fd3bc7edcd44a80b231d64bffb4fa"
Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.136278 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"
Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.597567 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"]
Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.606322 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"]
Feb 20 11:15:05 crc kubenswrapper[4962]: I0220 11:15:05.157555 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" path="/var/lib/kubelet/pods/6a57c99f-e682-43fc-85be-d6ca9b32dd2e/volumes"
Feb 20 11:15:17 crc kubenswrapper[4962]: I0220 11:15:17.254751 4962 scope.go:117] "RemoveContainer" containerID="672fb8e4a3790f1f70ac1c9ed16383d55019a5b81bdd7e7049f12caa51ab0535"
Feb 20 11:15:17 crc kubenswrapper[4962]: I0220 11:15:17.303023 4962 scope.go:117] "RemoveContainer" containerID="0fe78591e142b00ce0a1305d692098cbd58316c27a1f913e16fe16879c83db51"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.621407 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c98x6"]
Feb 20 11:16:00 crc kubenswrapper[4962]: E0220 11:16:00.622688 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerName="collect-profiles"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.622713 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerName="collect-profiles"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.622971 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerName="collect-profiles"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.625028 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.633895 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"]
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.791724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.791835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.791883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.893182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.893496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.893582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.894084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.896216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.926428 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.971621 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:01 crc kubenswrapper[4962]: I0220 11:16:01.428336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"]
Feb 20 11:16:01 crc kubenswrapper[4962]: I0220 11:16:01.701995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerStarted","Data":"ea600c71e0345a8043751e8b826aee7f560702337e800b5defa9d6a7fb10b53f"}
Feb 20 11:16:02 crc kubenswrapper[4962]: I0220 11:16:02.712494 4962 generic.go:334] "Generic (PLEG): container finished" podID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerID="b4d681ba38ab243d90aebb0cd3f5e0d964a3b05eff1c6a9189b417f8bc499f51" exitCode=0
Feb 20 11:16:02 crc kubenswrapper[4962]: I0220 11:16:02.712677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"b4d681ba38ab243d90aebb0cd3f5e0d964a3b05eff1c6a9189b417f8bc499f51"}
Feb 20 11:16:03 crc kubenswrapper[4962]: I0220 11:16:03.727958 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerStarted","Data":"cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7"}
Feb 20 11:16:04 crc kubenswrapper[4962]: I0220 11:16:04.741811 4962 generic.go:334] "Generic (PLEG): container finished" podID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerID="cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7" exitCode=0
Feb 20 11:16:04 crc kubenswrapper[4962]: I0220 11:16:04.741934 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7"}
Feb 20 11:16:05 crc kubenswrapper[4962]: I0220 11:16:05.757718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerStarted","Data":"1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb"}
Feb 20 11:16:05 crc kubenswrapper[4962]: I0220 11:16:05.792078 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c98x6" podStartSLOduration=3.283329453 podStartE2EDuration="5.792052917s" podCreationTimestamp="2026-02-20 11:16:00 +0000 UTC" firstStartedPulling="2026-02-20 11:16:02.714464458 +0000 UTC m=+4854.296936314" lastFinishedPulling="2026-02-20 11:16:05.223187902 +0000 UTC m=+4856.805659778" observedRunningTime="2026-02-20 11:16:05.787823218 +0000 UTC m=+4857.370295104" watchObservedRunningTime="2026-02-20 11:16:05.792052917 +0000 UTC m=+4857.374524803"
Feb 20 11:16:10 crc kubenswrapper[4962]: I0220 11:16:10.972844 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:10 crc kubenswrapper[4962]: I0220 11:16:10.973481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:11 crc kubenswrapper[4962]: I0220 11:16:11.056948 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:11 crc kubenswrapper[4962]: I0220 11:16:11.893743 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:11 crc kubenswrapper[4962]: I0220 11:16:11.940388 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"]
Feb 20 11:16:13 crc kubenswrapper[4962]: I0220 11:16:13.843312 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c98x6" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server" containerID="cri-o://1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb" gracePeriod=2
Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.855342 4962 generic.go:334] "Generic (PLEG): container finished" podID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerID="1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb" exitCode=0
Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.855910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb"}
Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.857004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"ea600c71e0345a8043751e8b826aee7f560702337e800b5defa9d6a7fb10b53f"}
Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.857078 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea600c71e0345a8043751e8b826aee7f560702337e800b5defa9d6a7fb10b53f"
Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.902942 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.046501 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"f80e860a-4910-4de7-9daf-3ecf4808b002\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") "
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.046685 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"f80e860a-4910-4de7-9daf-3ecf4808b002\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") "
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.046716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"f80e860a-4910-4de7-9daf-3ecf4808b002\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") "
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.047931 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities" (OuterVolumeSpecName: "utilities") pod "f80e860a-4910-4de7-9daf-3ecf4808b002" (UID: "f80e860a-4910-4de7-9daf-3ecf4808b002"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.055152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj" (OuterVolumeSpecName: "kube-api-access-hb7wj") pod "f80e860a-4910-4de7-9daf-3ecf4808b002" (UID: "f80e860a-4910-4de7-9daf-3ecf4808b002"). InnerVolumeSpecName "kube-api-access-hb7wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.136880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f80e860a-4910-4de7-9daf-3ecf4808b002" (UID: "f80e860a-4910-4de7-9daf-3ecf4808b002"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.148883 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") on node \"crc\" DevicePath \"\""
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.148951 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.148977 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.867858 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6"
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.933673 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"]
Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.946069 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"]
Feb 20 11:16:17 crc kubenswrapper[4962]: I0220 11:16:17.151912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" path="/var/lib/kubelet/pods/f80e860a-4910-4de7-9daf-3ecf4808b002/volumes"
Feb 20 11:16:41 crc kubenswrapper[4962]: I0220 11:16:41.508014 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 11:16:41 crc kubenswrapper[4962]: I0220 11:16:41.508725 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 11:17:11 crc kubenswrapper[4962]: I0220 11:17:11.508478 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 11:17:11 crc kubenswrapper[4962]: I0220 11:17:11.509216 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.508138 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.508953 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.509021 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46"
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.509962 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.510085 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" gracePeriod=600
Feb 20 11:17:41 crc kubenswrapper[4962]: E0220 11:17:41.641461 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.675902 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" exitCode=0
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.675980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"}
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.676033 4962 scope.go:117] "RemoveContainer" containerID="dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"
Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.676796 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"
Feb 20 11:17:41 crc kubenswrapper[4962]: E0220 11:17:41.677173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 11:17:52 crc kubenswrapper[4962]: I0220 11:17:52.138645 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"
Feb 20 11:17:52 crc kubenswrapper[4962]: E0220 11:17:52.139320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.882671 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"]
Feb 20 11:17:53 crc kubenswrapper[4962]: E0220 11:17:53.883421 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-content"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.883443 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-content"
Feb 20 11:17:53 crc kubenswrapper[4962]: E0220 11:17:53.883648 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.883717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server"
Feb 20 11:17:53 crc kubenswrapper[4962]: E0220 11:17:53.883780 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-utilities"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.883795 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-utilities"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.884044 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.885789 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.897354 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"]
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.966223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.966283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.966655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.067887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.067942 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.067964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.068416 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.068896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.101102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.210862 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.757052 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"]
Feb 20 11:17:55 crc kubenswrapper[4962]: W0220 11:17:55.149818 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839b832b_5799_4bcf_b028_f6d138668d44.slice/crio-45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61 WatchSource:0}: Error finding container 45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61: Status 404 returned error can't find the container with id 45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61
Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.821466 4962 generic.go:334] "Generic (PLEG): container finished" podID="839b832b-5799-4bcf-b028-f6d138668d44" containerID="d02121a7865602e447d9672ca12527892a8b4df01784c7fe6472f665dc92d541" exitCode=0
Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.821890 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"d02121a7865602e447d9672ca12527892a8b4df01784c7fe6472f665dc92d541"}
Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.821928 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerStarted","Data":"45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61"}
Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.823867 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 11:17:56 crc kubenswrapper[4962]: I0220 11:17:56.837804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerStarted","Data":"b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff"}
Feb 20 11:17:57 crc kubenswrapper[4962]: I0220 11:17:57.850667 4962 generic.go:334] "Generic (PLEG): container finished" podID="839b832b-5799-4bcf-b028-f6d138668d44" containerID="b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff" exitCode=0
Feb 20 11:17:57 crc kubenswrapper[4962]: I0220 11:17:57.850740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff"}
Feb 20 11:17:58 crc kubenswrapper[4962]: I0220 11:17:58.861454 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerStarted","Data":"ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e"}
Feb 20 11:17:58 crc kubenswrapper[4962]: I0220 11:17:58.900510 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8zmzk" podStartSLOduration=3.452509998 podStartE2EDuration="5.900480737s" podCreationTimestamp="2026-02-20 11:17:53 +0000 UTC" firstStartedPulling="2026-02-20 11:17:55.823480165 +0000 UTC m=+4967.405952041" lastFinishedPulling="2026-02-20 11:17:58.271450894 +0000 UTC m=+4969.853922780" observedRunningTime="2026-02-20 11:17:58.89399599 +0000 UTC m=+4970.476467886" watchObservedRunningTime="2026-02-20 11:17:58.900480737 +0000 UTC m=+4970.482952593"
Feb 20 11:18:04 crc kubenswrapper[4962]: I0220 11:18:04.211236 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:18:04 crc kubenswrapper[4962]: I0220 11:18:04.212466 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:18:04 crc kubenswrapper[4962]: I0220 11:18:04.287013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:18:05 crc kubenswrapper[4962]: I0220 11:18:05.010910 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8zmzk"
Feb 20 11:18:05 crc kubenswrapper[4962]: I0220 11:18:05.091166 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"]
Feb 20 11:18:05 crc kubenswrapper[4962]: I0220 11:18:05.139935 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"
Feb 20 11:18:05 crc kubenswrapper[4962]: E0220 11:18:05.140342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f"
Feb 20 11:18:06 crc kubenswrapper[4962]: I0220 11:18:06.950303 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8zmzk"
podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" containerID="cri-o://ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e" gracePeriod=2 Feb 20 11:18:07 crc kubenswrapper[4962]: I0220 11:18:07.964026 4962 generic.go:334] "Generic (PLEG): container finished" podID="839b832b-5799-4bcf-b028-f6d138668d44" containerID="ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e" exitCode=0 Feb 20 11:18:07 crc kubenswrapper[4962]: I0220 11:18:07.964094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e"} Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.560021 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.728077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"839b832b-5799-4bcf-b028-f6d138668d44\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.728159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"839b832b-5799-4bcf-b028-f6d138668d44\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.728260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"839b832b-5799-4bcf-b028-f6d138668d44\" (UID: 
\"839b832b-5799-4bcf-b028-f6d138668d44\") " Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.729738 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities" (OuterVolumeSpecName: "utilities") pod "839b832b-5799-4bcf-b028-f6d138668d44" (UID: "839b832b-5799-4bcf-b028-f6d138668d44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.737354 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg" (OuterVolumeSpecName: "kube-api-access-rgvqg") pod "839b832b-5799-4bcf-b028-f6d138668d44" (UID: "839b832b-5799-4bcf-b028-f6d138668d44"). InnerVolumeSpecName "kube-api-access-rgvqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.774630 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "839b832b-5799-4bcf-b028-f6d138668d44" (UID: "839b832b-5799-4bcf-b028-f6d138668d44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.829730 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.829785 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.829813 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") on node \"crc\" DevicePath \"\"" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.978479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61"} Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.978567 4962 scope.go:117] "RemoveContainer" containerID="ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.978568 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.017183 4962 scope.go:117] "RemoveContainer" containerID="b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff" Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.043377 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.055412 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.056493 4962 scope.go:117] "RemoveContainer" containerID="d02121a7865602e447d9672ca12527892a8b4df01784c7fe6472f665dc92d541" Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.156731 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="839b832b-5799-4bcf-b028-f6d138668d44" path="/var/lib/kubelet/pods/839b832b-5799-4bcf-b028-f6d138668d44/volumes" Feb 20 11:18:17 crc kubenswrapper[4962]: I0220 11:18:17.433326 4962 scope.go:117] "RemoveContainer" containerID="4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0" Feb 20 11:18:17 crc kubenswrapper[4962]: I0220 11:18:17.465468 4962 scope.go:117] "RemoveContainer" containerID="9ee74111c1ba86e5709005fcbe78e4bf5aa89be27bca01ef0b469ef1b5c60efd" Feb 20 11:18:17 crc kubenswrapper[4962]: I0220 11:18:17.493102 4962 scope.go:117] "RemoveContainer" containerID="a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa" Feb 20 11:18:18 crc kubenswrapper[4962]: I0220 11:18:18.139730 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:18 crc kubenswrapper[4962]: E0220 11:18:18.140162 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418064 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:18:26 crc kubenswrapper[4962]: E0220 11:18:26.418727 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-utilities" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418739 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-utilities" Feb 20 11:18:26 crc kubenswrapper[4962]: E0220 11:18:26.418760 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418766 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" Feb 20 11:18:26 crc kubenswrapper[4962]: E0220 11:18:26.418781 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-content" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418787 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-content" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418907 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.419587 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.420980 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4hjrx"/"default-dockercfg-xdbvn" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.421439 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4hjrx"/"openshift-service-ca.crt" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.421519 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4hjrx"/"kube-root-ca.crt" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.469033 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.562909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.562987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.664396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " 
pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.664501 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.664877 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.684209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.733007 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:27 crc kubenswrapper[4962]: I0220 11:18:27.011776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:18:27 crc kubenswrapper[4962]: I0220 11:18:27.158449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerStarted","Data":"c6cd07a28d727227c91863658be837f8ea3bd5357196bf48a15bd86c017219d9"} Feb 20 11:18:30 crc kubenswrapper[4962]: I0220 11:18:30.139506 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:30 crc kubenswrapper[4962]: E0220 11:18:30.140493 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:34 crc kubenswrapper[4962]: I0220 11:18:34.239555 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerStarted","Data":"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be"} Feb 20 11:18:34 crc kubenswrapper[4962]: I0220 11:18:34.241075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerStarted","Data":"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592"} Feb 20 11:18:34 crc kubenswrapper[4962]: I0220 11:18:34.260189 4962 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" podStartSLOduration=1.741347285 podStartE2EDuration="8.260166537s" podCreationTimestamp="2026-02-20 11:18:26 +0000 UTC" firstStartedPulling="2026-02-20 11:18:27.013141892 +0000 UTC m=+4998.595613738" lastFinishedPulling="2026-02-20 11:18:33.531961144 +0000 UTC m=+5005.114432990" observedRunningTime="2026-02-20 11:18:34.254900477 +0000 UTC m=+5005.837372343" watchObservedRunningTime="2026-02-20 11:18:34.260166537 +0000 UTC m=+5005.842638423" Feb 20 11:18:44 crc kubenswrapper[4962]: I0220 11:18:44.138888 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:44 crc kubenswrapper[4962]: E0220 11:18:44.140948 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:55 crc kubenswrapper[4962]: I0220 11:18:55.138743 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:55 crc kubenswrapper[4962]: E0220 11:18:55.141510 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:09 crc kubenswrapper[4962]: I0220 11:19:09.149364 4962 scope.go:117] "RemoveContainer" 
containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:09 crc kubenswrapper[4962]: E0220 11:19:09.150313 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:17 crc kubenswrapper[4962]: I0220 11:19:17.620202 4962 scope.go:117] "RemoveContainer" containerID="f56f77a29f18b053790ac8a764373585312883ee763879c9fe012ee4ec5c65e1" Feb 20 11:19:20 crc kubenswrapper[4962]: I0220 11:19:20.139918 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:20 crc kubenswrapper[4962]: E0220 11:19:20.140903 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.335845 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-589cf688cc-62bkq_10021bed-f80b-491c-8326-88df1a07c1f7/init/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.486648 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-589cf688cc-62bkq_10021bed-f80b-491c-8326-88df1a07c1f7/init/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.507109 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-589cf688cc-62bkq_10021bed-f80b-491c-8326-88df1a07c1f7/dnsmasq-dns/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.664290 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8374d0f9-f4be-4f6b-88eb-4849a2be49e9/memcached/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.707852 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_97ae547e-e977-4b15-a979-38415ee77885/mysql-bootstrap/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.881534 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_97ae547e-e977-4b15-a979-38415ee77885/galera/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.914201 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_97ae547e-e977-4b15-a979-38415ee77885/mysql-bootstrap/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.946209 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f2ffa3bc-ffbe-4a42-b14f-48aa20546210/mysql-bootstrap/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.117801 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f1374d6-d1c8-4b28-a524-485ced8ec7b9/setup-container/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.121556 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f2ffa3bc-ffbe-4a42-b14f-48aa20546210/galera/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.139056 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f2ffa3bc-ffbe-4a42-b14f-48aa20546210/mysql-bootstrap/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.359992 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f1374d6-d1c8-4b28-a524-485ced8ec7b9/setup-container/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.379575 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f1374d6-d1c8-4b28-a524-485ced8ec7b9/rabbitmq/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.433708 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-2mzpw_7ae4e019-31b7-4826-a5ef-042faba6034d/mariadb-account-create-update/0.log" Feb 20 11:19:31 crc kubenswrapper[4962]: I0220 11:19:31.138827 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:31 crc kubenswrapper[4962]: E0220 11:19:31.139426 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:44 crc kubenswrapper[4962]: I0220 11:19:44.139155 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:44 crc kubenswrapper[4962]: E0220 11:19:44.140036 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.519319 4962 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/util/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.685363 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/util/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.730504 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/pull/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.750327 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/pull/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.928518 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/extract/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.939635 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/util/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.980765 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/pull/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.340514 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-r2t72_cf0e10ba-c175-44c3-9011-6646f21ba334/manager/0.log" 
Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.638340 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-wcqzf_ea986843-26e4-4410-a65e-ae51c02dc04c/manager/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.809437 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-75vx4_fee6970c-0ad7-46ea-ab75-dcb7d552ffbb/manager/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.995812 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-rhhc7_12f33757-f329-47a6-9273-bdeb1558a4d7/manager/0.log" Feb 20 11:19:50 crc kubenswrapper[4962]: I0220 11:19:50.499936 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-2hg4n_5fec06f1-8ccf-403c-88de-2b581f056802/manager/0.log" Feb 20 11:19:50 crc kubenswrapper[4962]: I0220 11:19:50.580542 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-4rnhn_0c8c62e9-0201-43a4-b823-82af87a0977e/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.013799 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-jjbwt_7afb870a-75a4-42d5-9704-5cef14dd3ce9/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.231580 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-6lvhz_a9979be5-6650-425b-a748-51e2cb552413/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.473001 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-wn92v_f8f1dca9-8b83-469d-b834-3f11376576c9/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.482994 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-bsq9n_ac33f7ed-c3f8-487d-89dc-4a614d357b86/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.695457 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-knwp9_6fdeab3e-de35-4d69-9e67-e5d8257bc25d/manager/0.log" Feb 20 11:19:52 crc kubenswrapper[4962]: I0220 11:19:52.228025 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-d2clq_4e2614ed-ea7a-430e-af7b-4d66f05f7b96/manager/0.log" Feb 20 11:19:52 crc kubenswrapper[4962]: I0220 11:19:52.408844 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf_f8de466d-f069-4a8e-8598-72a163525c24/manager/0.log" Feb 20 11:19:52 crc kubenswrapper[4962]: I0220 11:19:52.776326 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-n5hm2_ad363690-9ad6-4f45-ac02-d51ec41d213b/operator/0.log" Feb 20 11:19:53 crc kubenswrapper[4962]: I0220 11:19:53.047214 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-t9zxk_46f437ac-c97a-4af9-92e7-6bec63b7d8d8/registry-server/0.log" Feb 20 11:19:53 crc kubenswrapper[4962]: I0220 11:19:53.518858 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-nlq5k_72728d52-a8e9-4689-8da0-871f250f7664/manager/0.log" Feb 20 11:19:53 crc kubenswrapper[4962]: I0220 11:19:53.654498 4962 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-x4gh4_34cb38e0-7c0a-4f00-89e9-9be7b394585d/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.005711 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-rqmzz_98bbcdbd-382d-48ca-aa14-3e9ba4b63c98/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.014108 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5mrjv_5691d6ef-dedb-4a46-a1b6-0435e9f6db0a/operator/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.190028 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-9pxbg_4a325f02-ddda-49e9-9ef0-40fd4726b09f/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.345566 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-mfpm9_7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.421096 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-lxl4x_32d42cbd-4ea1-49cc-b9d4-33fe5f655a16/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.615032 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-kthxs_4c8bff11-1a85-4f9b-8fb2-defd04ac22d1/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.836986 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ln4sp_14efe385-5147-49ed-a42f-804b91438a55/manager/0.log" Feb 20 11:19:57 crc kubenswrapper[4962]: I0220 11:19:57.138138 4962 scope.go:117] 
"RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:57 crc kubenswrapper[4962]: E0220 11:19:57.138342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:00 crc kubenswrapper[4962]: I0220 11:20:00.604235 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-nhpg5_e0560856-ed00-4ea8-8ce7-a801f1d46489/manager/0.log" Feb 20 11:20:10 crc kubenswrapper[4962]: I0220 11:20:10.139058 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:10 crc kubenswrapper[4962]: E0220 11:20:10.140142 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:17 crc kubenswrapper[4962]: I0220 11:20:17.252345 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hc9h5_75c3ba8d-4548-4407-9188-a785ef05da2c/control-plane-machine-set-operator/0.log" Feb 20 11:20:17 crc kubenswrapper[4962]: I0220 11:20:17.355986 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ckmh2_a7a9fa76-da75-4847-a539-d1e6bb57da98/kube-rbac-proxy/0.log" Feb 20 11:20:17 crc kubenswrapper[4962]: I0220 11:20:17.415774 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ckmh2_a7a9fa76-da75-4847-a539-d1e6bb57da98/machine-api-operator/0.log" Feb 20 11:20:24 crc kubenswrapper[4962]: I0220 11:20:24.139802 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:24 crc kubenswrapper[4962]: E0220 11:20:24.141226 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:31 crc kubenswrapper[4962]: I0220 11:20:31.741102 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-ctc7p_41c6ef1c-4069-44b1-a0ba-de5e820a630c/cert-manager-controller/0.log" Feb 20 11:20:31 crc kubenswrapper[4962]: I0220 11:20:31.900931 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-lvkdh_4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3/cert-manager-cainjector/0.log" Feb 20 11:20:31 crc kubenswrapper[4962]: I0220 11:20:31.945856 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-t5nv4_0d86f751-d081-47b7-a623-a9cc14ab43f7/cert-manager-webhook/0.log" Feb 20 11:20:37 crc kubenswrapper[4962]: I0220 11:20:37.139103 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:37 crc 
kubenswrapper[4962]: E0220 11:20:37.140133 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.403981 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-2hqd7_e17e90c9-fe19-4544-9a79-bffc8072a763/nmstate-console-plugin/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.602101 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-frtsf_5056ae4f-c2f7-41f5-8e12-b7b5d8996852/nmstate-handler/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.747111 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-6x8wh_edcc687e-09ef-4048-8db7-d67e6fe23212/nmstate-metrics/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.774517 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-6x8wh_edcc687e-09ef-4048-8db7-d67e6fe23212/kube-rbac-proxy/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.913011 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-nkzm2_cffc71cf-18b7-4733-b863-19b8664b5cf4/nmstate-operator/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.983980 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-l2lqb_a453e12b-e95c-4c04-b67b-b5bc6527a3ab/nmstate-webhook/0.log" Feb 20 11:20:52 crc kubenswrapper[4962]: I0220 11:20:52.139211 4962 scope.go:117] 
"RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:52 crc kubenswrapper[4962]: E0220 11:20:52.140050 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:06 crc kubenswrapper[4962]: I0220 11:21:06.140110 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:06 crc kubenswrapper[4962]: E0220 11:21:06.141077 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:18 crc kubenswrapper[4962]: I0220 11:21:18.867139 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-29wdn_7af7ee52-8865-48ce-85e5-7b62fb0d67d3/kube-rbac-proxy/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.176373 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-hb87m_7135845d-f595-42df-9773-7701c9a0b2e2/frr-k8s-webhook-server/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.214649 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-29wdn_7af7ee52-8865-48ce-85e5-7b62fb0d67d3/controller/0.log" Feb 20 11:21:19 crc 
kubenswrapper[4962]: I0220 11:21:19.313191 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.486062 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.487782 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.498280 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.536799 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.675153 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.707984 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.708679 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.742311 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.886335 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.897317 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/controller/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.909906 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.910796 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.130873 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/frr-metrics/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.137390 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/kube-rbac-proxy-frr/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.137824 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/kube-rbac-proxy/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.138178 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:20 crc kubenswrapper[4962]: E0220 11:21:20.138366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.301997 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/reloader/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.351448 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7964458f8b-6fxbj_403ba47d-bbe1-48f6-9382-47f12bbb75ae/manager/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.526254 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79fb478cb4-wmzpd_2ae49f4e-271b-40e8-9cfc-9857fc2de6f3/webhook-server/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.732655 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rx2lw_c8b5efc7-c8c4-4492-a8a9-31eaecfa8374/kube-rbac-proxy/0.log" Feb 20 11:21:21 crc kubenswrapper[4962]: I0220 11:21:21.099087 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rx2lw_c8b5efc7-c8c4-4492-a8a9-31eaecfa8374/speaker/0.log" Feb 20 11:21:21 crc kubenswrapper[4962]: I0220 11:21:21.344636 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/frr/0.log" Feb 20 11:21:34 crc kubenswrapper[4962]: I0220 11:21:34.138896 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:34 crc kubenswrapper[4962]: E0220 11:21:34.139909 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.066450 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.214077 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.250376 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.286526 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.439114 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.453750 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/extract/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.529322 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.599079 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.784996 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.807507 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.840190 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.978214 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/extract/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.002529 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/util/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.017363 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/pull/0.log" Feb 20 
11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.151999 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.308866 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.355329 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.379290 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.536783 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.542496 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.740058 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.748365 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/registry-server/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.876684 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.913986 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.922497 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-utilities/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.098614 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-utilities/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.108311 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-content/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.287367 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/util/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.525925 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/pull/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.572358 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/util/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.645555 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/pull/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.751502 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/util/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.775465 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/pull/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.806691 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/registry-server/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.835036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/extract/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.925933 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mzhb4_34e2f7a3-366d-4817-a502-720b5f9a782e/marketplace-operator/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.002610 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.151738 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.166716 4962 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.204011 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.358145 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.372929 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.545332 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.546589 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/registry-server/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.560974 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.562339 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.572498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.687950 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.687996 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.688221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.789866 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.789936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.789962 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.790401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.790443 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.803855 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.805514 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.809655 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.812203 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.914276 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.003258 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-utilities/0.log" Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.028151 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-content/0.log" Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.392827 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.677156 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" exitCode=0 Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.677197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225"} Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.677221 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerStarted","Data":"b1094c83d369f3971efc0ed7014799eaab5e95ee8bff0716f242a67ae96948ae"} Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.714188 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/registry-server/0.log" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.767582 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.771808 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.783225 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.922135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.922176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.922195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.023711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.023752 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.023766 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.024229 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.024233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.044997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.150774 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.608526 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.699162 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" exitCode=0 Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.699242 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38"} Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.700574 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerStarted","Data":"a4efcf7fb75161d9e5487760d1e0134b75e87ef06f4fca980c54ba2517209850"} Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.711617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerStarted","Data":"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c"} Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.713176 4962 generic.go:334] "Generic (PLEG): container finished" podID="49deb74e-5930-4583-9544-bbc0c34723d6" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" exitCode=0 Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.713213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2"} Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.746361 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ftsbz" podStartSLOduration=2.279025758 podStartE2EDuration="4.746344955s" podCreationTimestamp="2026-02-20 11:21:38 +0000 UTC" firstStartedPulling="2026-02-20 11:21:39.679223504 +0000 UTC m=+5191.261695350" lastFinishedPulling="2026-02-20 11:21:42.146542701 +0000 UTC m=+5193.729014547" observedRunningTime="2026-02-20 11:21:42.743479268 +0000 UTC m=+5194.325951134" watchObservedRunningTime="2026-02-20 11:21:42.746344955 +0000 UTC m=+5194.328816801" Feb 20 11:21:43 crc kubenswrapper[4962]: I0220 11:21:43.723695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerStarted","Data":"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7"} Feb 20 11:21:44 crc kubenswrapper[4962]: I0220 11:21:44.736270 4962 generic.go:334] "Generic (PLEG): container finished" podID="49deb74e-5930-4583-9544-bbc0c34723d6" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" exitCode=0 
Feb 20 11:21:44 crc kubenswrapper[4962]: I0220 11:21:44.736395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7"} Feb 20 11:21:45 crc kubenswrapper[4962]: I0220 11:21:45.754262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerStarted","Data":"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4"} Feb 20 11:21:45 crc kubenswrapper[4962]: I0220 11:21:45.778086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94hk8" podStartSLOduration=3.251112895 podStartE2EDuration="5.778069938s" podCreationTimestamp="2026-02-20 11:21:40 +0000 UTC" firstStartedPulling="2026-02-20 11:21:42.715468405 +0000 UTC m=+5194.297940251" lastFinishedPulling="2026-02-20 11:21:45.242425418 +0000 UTC m=+5196.824897294" observedRunningTime="2026-02-20 11:21:45.776438219 +0000 UTC m=+5197.358910075" watchObservedRunningTime="2026-02-20 11:21:45.778069938 +0000 UTC m=+5197.360541784" Feb 20 11:21:48 crc kubenswrapper[4962]: I0220 11:21:48.138915 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:48 crc kubenswrapper[4962]: E0220 11:21:48.139229 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:48 crc kubenswrapper[4962]: I0220 
11:21:48.915411 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:48 crc kubenswrapper[4962]: I0220 11:21:48.917136 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:49 crc kubenswrapper[4962]: I0220 11:21:49.980169 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ftsbz" podUID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerName="registry-server" probeResult="failure" output=< Feb 20 11:21:49 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 11:21:49 crc kubenswrapper[4962]: > Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.156213 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.156714 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.228238 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.883529 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.944220 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:53 crc kubenswrapper[4962]: I0220 11:21:53.836777 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94hk8" podUID="49deb74e-5930-4583-9544-bbc0c34723d6" containerName="registry-server" 
containerID="cri-o://4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" gracePeriod=2 Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.801314 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844255 4962 generic.go:334] "Generic (PLEG): container finished" podID="49deb74e-5930-4583-9544-bbc0c34723d6" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" exitCode=0 Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844302 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4"} Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"a4efcf7fb75161d9e5487760d1e0134b75e87ef06f4fca980c54ba2517209850"} Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844368 4962 scope.go:117] "RemoveContainer" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844497 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.860016 4962 scope.go:117] "RemoveContainer" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.877030 4962 scope.go:117] "RemoveContainer" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.899565 4962 scope.go:117] "RemoveContainer" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" Feb 20 11:21:54 crc kubenswrapper[4962]: E0220 11:21:54.900036 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4\": container with ID starting with 4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4 not found: ID does not exist" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900099 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4"} err="failed to get container status \"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4\": rpc error: code = NotFound desc = could not find container \"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4\": container with ID starting with 4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4 not found: ID does not exist" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900139 4962 scope.go:117] "RemoveContainer" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" Feb 20 11:21:54 crc kubenswrapper[4962]: E0220 11:21:54.900418 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7\": container with ID starting with f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7 not found: ID does not exist" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900445 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7"} err="failed to get container status \"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7\": rpc error: code = NotFound desc = could not find container \"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7\": container with ID starting with f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7 not found: ID does not exist" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900465 4962 scope.go:117] "RemoveContainer" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" Feb 20 11:21:54 crc kubenswrapper[4962]: E0220 11:21:54.902248 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2\": container with ID starting with 7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2 not found: ID does not exist" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.902287 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2"} err="failed to get container status \"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2\": rpc error: code = NotFound desc = could not find container 
\"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2\": container with ID starting with 7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2 not found: ID does not exist" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.983453 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"49deb74e-5930-4583-9544-bbc0c34723d6\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.983521 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"49deb74e-5930-4583-9544-bbc0c34723d6\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.983634 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"49deb74e-5930-4583-9544-bbc0c34723d6\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.985015 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities" (OuterVolumeSpecName: "utilities") pod "49deb74e-5930-4583-9544-bbc0c34723d6" (UID: "49deb74e-5930-4583-9544-bbc0c34723d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.997773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6" (OuterVolumeSpecName: "kube-api-access-82xw6") pod "49deb74e-5930-4583-9544-bbc0c34723d6" (UID: "49deb74e-5930-4583-9544-bbc0c34723d6"). InnerVolumeSpecName "kube-api-access-82xw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.045105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49deb74e-5930-4583-9544-bbc0c34723d6" (UID: "49deb74e-5930-4583-9544-bbc0c34723d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.086079 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.086129 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.086151 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") on node \"crc\" DevicePath \"\"" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.196186 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 
11:21:55.203304 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:57 crc kubenswrapper[4962]: I0220 11:21:57.152282 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49deb74e-5930-4583-9544-bbc0c34723d6" path="/var/lib/kubelet/pods/49deb74e-5930-4583-9544-bbc0c34723d6/volumes" Feb 20 11:21:58 crc kubenswrapper[4962]: I0220 11:21:58.992897 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:59 crc kubenswrapper[4962]: I0220 11:21:59.067946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:59 crc kubenswrapper[4962]: I0220 11:21:59.250566 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:22:00 crc kubenswrapper[4962]: I0220 11:22:00.893816 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ftsbz" podUID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerName="registry-server" containerID="cri-o://e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" gracePeriod=2 Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.139410 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.139788 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:22:01 crc 
kubenswrapper[4962]: I0220 11:22:01.421143 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.594125 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"fc360947-b1e2-4ac0-8447-c7e886e036a0\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.594175 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"fc360947-b1e2-4ac0-8447-c7e886e036a0\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.594229 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"fc360947-b1e2-4ac0-8447-c7e886e036a0\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.595139 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities" (OuterVolumeSpecName: "utilities") pod "fc360947-b1e2-4ac0-8447-c7e886e036a0" (UID: "fc360947-b1e2-4ac0-8447-c7e886e036a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.607788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j" (OuterVolumeSpecName: "kube-api-access-2sb4j") pod "fc360947-b1e2-4ac0-8447-c7e886e036a0" (UID: "fc360947-b1e2-4ac0-8447-c7e886e036a0"). InnerVolumeSpecName "kube-api-access-2sb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.695735 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.695766 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") on node \"crc\" DevicePath \"\"" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.714582 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc360947-b1e2-4ac0-8447-c7e886e036a0" (UID: "fc360947-b1e2-4ac0-8447-c7e886e036a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.796913 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.903936 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" exitCode=0 Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.903977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c"} Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.904010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"b1094c83d369f3971efc0ed7014799eaab5e95ee8bff0716f242a67ae96948ae"} Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.904010 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.904027 4962 scope.go:117] "RemoveContainer" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.927272 4962 scope.go:117] "RemoveContainer" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.939628 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.948928 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.963879 4962 scope.go:117] "RemoveContainer" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.993208 4962 scope.go:117] "RemoveContainer" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.993643 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c\": container with ID starting with e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c not found: ID does not exist" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.993711 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c"} err="failed to get container status \"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c\": rpc error: code = NotFound desc = could not find container 
\"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c\": container with ID starting with e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c not found: ID does not exist" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.993735 4962 scope.go:117] "RemoveContainer" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.994476 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38\": container with ID starting with 221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38 not found: ID does not exist" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.994507 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38"} err="failed to get container status \"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38\": rpc error: code = NotFound desc = could not find container \"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38\": container with ID starting with 221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38 not found: ID does not exist" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.994534 4962 scope.go:117] "RemoveContainer" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.994862 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225\": container with ID starting with e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225 not found: ID does not exist" 
containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.994907 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225"} err="failed to get container status \"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225\": rpc error: code = NotFound desc = could not find container \"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225\": container with ID starting with e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225 not found: ID does not exist" Feb 20 11:22:03 crc kubenswrapper[4962]: I0220 11:22:03.154277 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc360947-b1e2-4ac0-8447-c7e886e036a0" path="/var/lib/kubelet/pods/fc360947-b1e2-4ac0-8447-c7e886e036a0/volumes" Feb 20 11:22:16 crc kubenswrapper[4962]: I0220 11:22:16.139876 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:16 crc kubenswrapper[4962]: E0220 11:22:16.140942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:22:17 crc kubenswrapper[4962]: I0220 11:22:17.709336 4962 scope.go:117] "RemoveContainer" containerID="b4d681ba38ab243d90aebb0cd3f5e0d964a3b05eff1c6a9189b417f8bc499f51" Feb 20 11:22:17 crc kubenswrapper[4962]: I0220 11:22:17.766705 4962 scope.go:117] "RemoveContainer" containerID="cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7" Feb 20 11:22:17 crc kubenswrapper[4962]: I0220 
11:22:17.809134 4962 scope.go:117] "RemoveContainer" containerID="1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb" Feb 20 11:22:29 crc kubenswrapper[4962]: I0220 11:22:29.150147 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:29 crc kubenswrapper[4962]: E0220 11:22:29.150937 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:22:42 crc kubenswrapper[4962]: I0220 11:22:42.139848 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:43 crc kubenswrapper[4962]: I0220 11:22:43.318062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26"} Feb 20 11:22:44 crc kubenswrapper[4962]: I0220 11:22:44.074013 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:22:44 crc kubenswrapper[4962]: I0220 11:22:44.081990 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:22:45 crc kubenswrapper[4962]: I0220 11:22:45.148924 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" path="/var/lib/kubelet/pods/7ae4e019-31b7-4826-a5ef-042faba6034d/volumes" Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.441151 4962 generic.go:334] "Generic 
(PLEG): container finished" podID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" exitCode=0 Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.441275 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerDied","Data":"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592"} Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.442539 4962 scope.go:117] "RemoveContainer" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.742334 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hjrx_must-gather-vfvhw_fd1e5da5-b553-419b-b874-baa4d0b09f1d/gather/0.log" Feb 20 11:23:03 crc kubenswrapper[4962]: I0220 11:23:03.906545 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:23:03 crc kubenswrapper[4962]: I0220 11:23:03.907641 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" podUID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" containerName="copy" containerID="cri-o://51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" gracePeriod=2 Feb 20 11:23:03 crc kubenswrapper[4962]: I0220 11:23:03.915664 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.363992 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hjrx_must-gather-vfvhw_fd1e5da5-b553-419b-b874-baa4d0b09f1d/copy/0.log" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.364656 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.461063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.461530 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.468020 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68" (OuterVolumeSpecName: "kube-api-access-5xc68") pod "fd1e5da5-b553-419b-b874-baa4d0b09f1d" (UID: "fd1e5da5-b553-419b-b874-baa4d0b09f1d"). InnerVolumeSpecName "kube-api-access-5xc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.527632 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hjrx_must-gather-vfvhw_fd1e5da5-b553-419b-b874-baa4d0b09f1d/copy/0.log" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.528111 4962 generic.go:334] "Generic (PLEG): container finished" podID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" exitCode=143 Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.528177 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.528188 4962 scope.go:117] "RemoveContainer" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.562866 4962 scope.go:117] "RemoveContainer" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.563471 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") on node \"crc\" DevicePath \"\"" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.595980 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fd1e5da5-b553-419b-b874-baa4d0b09f1d" (UID: "fd1e5da5-b553-419b-b874-baa4d0b09f1d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.637887 4962 scope.go:117] "RemoveContainer" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" Feb 20 11:23:04 crc kubenswrapper[4962]: E0220 11:23:04.638649 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be\": container with ID starting with 51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be not found: ID does not exist" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.638703 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be"} err="failed to get container status \"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be\": rpc error: code = NotFound desc = could not find container \"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be\": container with ID starting with 51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be not found: ID does not exist" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.638734 4962 scope.go:117] "RemoveContainer" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:23:04 crc kubenswrapper[4962]: E0220 11:23:04.639428 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592\": container with ID starting with 68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592 not found: ID does not exist" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.639460 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592"} err="failed to get container status \"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592\": rpc error: code = NotFound desc = could not find container \"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592\": container with ID starting with 68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592 not found: ID does not exist" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.664662 4962 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 11:23:05 crc kubenswrapper[4962]: I0220 11:23:05.147689 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" path="/var/lib/kubelet/pods/fd1e5da5-b553-419b-b874-baa4d0b09f1d/volumes" Feb 20 11:23:17 crc kubenswrapper[4962]: I0220 11:23:17.908128 4962 scope.go:117] "RemoveContainer" containerID="8b9ab6691837647b0967e622ecfd0e62f3ce7907b1cef344ccbbdd1bcb192e5e" Feb 20 11:25:11 crc kubenswrapper[4962]: I0220 11:25:11.508031 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:25:11 crc kubenswrapper[4962]: I0220 11:25:11.508538 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:25:41 
crc kubenswrapper[4962]: I0220 11:25:41.508578 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:25:41 crc kubenswrapper[4962]: I0220 11:25:41.509448 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.508900 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.509689 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.509760 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.510832 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26"} 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.510931 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26" gracePeriod=600 Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.259844 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26" exitCode=0 Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.259894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26"} Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.260481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"0b314312dcb2f11e67f59432c45683ba13a50c0ae9ba6a5dd639de24db881b08"} Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.260517 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"